
Appendix 
Volume II 



The 

Belmont 

Report 

Ethical Principles 

and Guidelines for 

the Protection of 

Human Subjects 

of Research 



The National Commission 

for the Protection of Human Subjects 

of Biomedical and Behavioral 

Research 






This Appendix contains (in two volumes) 

the full text of the papers that were prepared 

to assist the Commission in its consideration 

of the basic ethical principles that should 

underlie the conduct of research 

involving human subjects. 



DHEW Publication No. (OS) 78-0014 



For sale by the Superintendent of Documents, U.S. Government Printing Office, Washington, D.C. 20402









TABLE OF CONTENTS 
APPENDIX TO BELMONT REPORT 
Volumes I and II 
Volume I

I. PRELIMINARY PAPERS PREPARED FOR THE COMMISSION 
BY ROBERT J. LEVINE, M.D.



1. The Boundaries Between Biomedical or 

Behavioral Research and the Accepted 
and Routine Practice of Medicine 

2. The Role of Assessment of Risk Benefit 

Criteria in the Determination of the 
Appropriateness of Research Involving 
Human Subjects 

3. The Nature and Definition of Informed 

Consent in Various Research Settings 

4. Appropriate Guidelines for the Selection 

of Human Subjects for Participation in 
Biomedical and Behavioral Research 



II. BASIC ETHICAL PRINCIPLES RELATING TO RESEARCH 
INVOLVING HUMAN SUBJECTS 



5. Ethical Principles and Their Validity Kurt Baier, D. Phil. 

6. Distributive Justice and Morally Relevant 

Differences Tom Beauchamp, Ph.D.

7. The Identification of Ethical Principles James Childress, B.D., Ph.D. 

8. Basic Ethical Principles in the Conduct of 

Biomedical and Behavioral Research 

Involving Human Subjects H. Tristram Engelhardt, 

Jr., Ph.D., M.D. 

9. Medical Ethics and the Architecture of 

Clinical Research Alvan R. Feinstein, M.D. 

Jeffrey L. Lichtenstein, 
M.D. 

10. How to Identify Ethical Principles Alasdair MacIntyre, M.A.



Belmont Appendix 



11. Some Ethical Issues in Research Involving
Human Subjects LeRoy Walters, B.D., Ph.D.



Volume II 
III. BOUNDARIES BETWEEN RESEARCH AND PRACTICE 



12. Protection of the Rights and Interests 

of Human Subjects in the Areas of Pro- 
gram Evaluation, Social Experimenta- 
tion, Social Indicators, Survey Re- 
search, Secondary Analysis of Research 
Data, and Statistical Analysis of Data 

From Administrative Records Donald T. Campbell, Ph.D. 

Joe Shelby Cecil, Ph.D. 

13. Response to Commission Duties as Detailed 

in P.L. 93-348, Sec. 202(a)(1)(B)(i) Donald Gallant, M.D.

14. On the Usefulness of Intent for Distinguishing 

Between Research and Practice, and Its Replace- 
ment by Social Contingency Israel Goldiamond, Ph.D. 

15. Boundaries Between Research and Therapy,

Especially in Mental Health Perry London, Ph.D. 

Gerald Klerman, M.D. 



16. Legal Implications of the Boundaries
Between Biomedical Research Involving
Human Subjects and the Accepted or
Routine Practice of Medicine John Robertson, J.D.

17. The Boundaries Between Biomedical Re-
search Involving Human Subjects and
the Accepted or Routine Practice of
Medicine, with Particular Emphasis on
Innovation in the Practice of Surgery David Sabiston, M.D.

18. What Problems are Raised When the Current
DHEW Regulation on Protection of Human
Subjects is Applied to Social Science
Research? Richard A. Tropp




IV. RISK/BENEFIT CRITERIA 



19. Some Perspectives on the Role of Assess- 

ment of Risk/Benefit Criteria in the 

Determination of the Appropriateness of 

Research Involving Human Subjects Bernard Barber, Ph.D.

20. The Role of Risk/Benefit Analysis in the 

Conduct of Psychological Research Gregory Kimble, Ph.D. 

21. A Philosophical Perspective on the Assess- 

ment of Risk-Benefit Criteria in Connection 

with Research Involving Human Subjects Maurice Natanson, Ph.D. 

22. Essay on Some Problems of Risk-Benefit 

Analysis in Clinical Pharmacology ........ Lawrence C. Raisz, M.D.



V. INFORMED CONSENT



23. Nature and Definition of Informed 

Consent in Research Involving Deception Diana Baumrind, Ph.D.

24. Some Complexities and Uncertainties 

Regarding the Ethicality of Deception

in Research with Human Subjects Leonard Berkowitz, Ph.D.

25. Selected Issues in Informed Consent and 

Confidentiality with Special Reference 
to Behavioral/Social Science Research/ 
Inquiry Albert Reiss, Jr., Ph.D. 

26. Three Theories of Informed Consent: 

Philosophical Foundations and Policy 

Implications Robert Veatch, Ph.D.



III

BOUNDARIES BETWEEN RESEARCH AND PRACTICE 



12 
PROTECTION OF THE RIGHTS AND INTERESTS 
OF HUMAN SUBJECTS IN THE AREAS OF PROGRAM EVALUATION, 
SOCIAL EXPERIMENTATION, SOCIAL INDICATORS, 
SURVEY RESEARCH, SECONDARY ANALYSIS OF 
RESEARCH DATA, AND STATISTICAL ANALYSIS OF DATA 
FROM ADMINISTRATIVE RECORDS 

Donald T. Campbell, Ph.D.

and
Joe Shelby Cecil, Ph.D.



Protection of the Rights and Interests of Human Subjects in the Areas 
of Program Evaluation, Social Experimentation, Social Indicators, 
Survey Research, Secondary Analysis of Research Data, and 
Statistical Analysis of Data From Administrative Records 

Donald T. Campbell and Joe Shelby Cecil 

Northwestern University 



An important task facing the National Commission for the Protection of 
Human Subjects of Biomedical and Behavioral Research is the establishment 
of standards for the burgeoning new areas of program evaluation, social in- 
dicators, and related activities (to be collectively designated "program 
evaluation" in this manuscript unless greater specificity is needed). All
of these activities are "research" (usually behavioral research) in the 
sense of Public Law 93-348; thus they fall within the scope of the Commission's
assignments. As Institutional Review Boards become increasingly involved in 
approving such research, they could benefit from guidelines prepared by the 
NCPHSBBR for this novel set of problems. 

While the participants in such research clearly have rights and interests 
which may be violated, the nature of these threats is somewhat unique. Rarely 
will risk to physical health be involved. Indeed, the experimental group par- 
ticipants often receive an apparent boon which the control group participants 
may well feel they equally deserve, so that control group rights may often be 
the greater problem. The more frequent danger in program evaluation is the 
risk that the research data will be misused since sensitive information is 
often collected. Such data may be subpoenaed by prosecutors searching for 
evidence of crimes, or become a source of malicious gossip or blackmail. 
Federally funded program evaluations frequently require auditing, verifica- 
tion, and reanalysis. These activities may preclude a promise of complete 
confidentiality to the respondents and increase the risk that the informa- 
tion they provide will be used improperly. However, if respondents are fully 
informed of these risks, the quality of the research data may be diminished. 
From these few examples it is apparent that these areas of social research 
present a different set of problems from those encountered in medical and 
laboratory research. 

This problem area has already received attention from several national 
organizations. For instance, the Social Science Research Council's Committee 
on Social Experimentation considered these issues at length over a four-year 
period, producing a short chapter on "Human Values and Social Experimentation" 
(Riecken, Boruch, et al., 1974, pp. 245-269). The contemporaneous National
Academy of Sciences - National Research Council "Committee on Federal Agency
Evaluation Research" addressed these issues in its report entitled Protecting
Individual Privacy in Evaluation Research (Rivlin, et al., 1975). (One of the
present authors participated in both of these committees.) The Privacy Protec-
tion Study Commission, established by the Privacy Act of 1974, has extensively
considered the problem of maintaining confidentiality of research information 
(Notice of Hearing and Draft Recommendations: Research and Statistics, January 
6, 1977). The Social Science Research Council has a longstanding committee 






and special staff devoted to Social Indicators, and is establishing a new 
committee on program evaluation. The Brookings Panel on Social Experimenta- 
tion recently published a series of papers on this topic (Rivlin and Timpane, 
1975) . Special committees with this concern exist in many professional organ- 
izations. This recent activity provides the National Commission with a unique 
opportunity to integrate these diverse findings into a general code protecting 
the rights of subjects participating in these new areas of research. 

Background Comments:

Like the others who have agreed to write background papers for the 
Commission, the present writers have volunteered to do so because of strong 
concerns on this matter. In these areas of research, two widely cherished 
values are in potential conflict. The subject's right of privacy may conflict 
with the researcher's need to gather sensitive information necessary for mean- 
ingful program evaluation. We wish to make explicit our manner of resolving 
this conflict. In agreement with the dominant mood in Washington, we recognize 
the right to privacy of individuals participating in these areas of research. 
This paper includes several suggestions which would result in increased pro- 
tection for the privacy of research participants. However, our greater fear 
is that Congress and the administration will needlessly preclude important 
program evaluation and access to research information through ill-considered 
efforts to protect individual privacy. For example, special procedures of file 
linkage permit inexpensive and highly relevant program evaluation. Although 
these procedures require the retrieval of administrative records, they may be 
employed without jeopardizing the privacy of program participants. (The case 
for such procedures will be presented in the context of specific recommenda- 
tions.) We urge that special caution be exercised to avoid creating rules 
that unnecessarily restrict these procedures. 

Before providing our recommendations we wish to set the scope of this 
report by defining some of the major terms that will be employed: 

Program Evaluation: Assembly of evidence bearing on the effectiveness
and side effects of ameliorative programs, social innovations, etc. These 
programs have usually been initiated by governments. 

Social Indicators: Statistical summaries, often in time-series form,
bearing on the well-being of the nation or smaller social units. Social 
indicators may be viewed in contrast to more common economic indicators. 
Many social indicators are generated from statistical summaries of adminis- 
trative records. Others, such as indicators based on the Census, are produced 
by institutionalized survey procedures. Increasing attention is being given 
to "subjective" social indicators, in which representative samples of the 
public report on their "happiness" or satisfaction with various aspects of 
their lives in public opinion surveys. 

Social Experimentation: This will be narrowly defined, as it was in the
SSRC volume (Riecken, et al., 1974), to refer to an experimental form of policy 
research and/or program evaluation, experiments carried out in social (as op- 
posed to laboratory) settings evaluating governmental or other social inter- 
ventions. (This definition excludes experiments in public settings to test 
social science theories, an important form of social experimentation that the 
National Commission is attending to through other background papers.) 






Respondents: Participants, interviewees, anthropological "informants,"
the persons whose responses are recorded, the "subjects" of research, etc. 
Many social scientists prefer the terms "respondent" or "participant" to the 
term "subject," since the term "subject" has been associated with an exploi- 
tative attitude neglecting the rights and interests of the research cooperator. 

Statistical Data: The Privacy Act of 1974 uses this term to refer to 
information collected originally for research rather than administrative 
purposes. This usage will be avoided here in favor of research data . 

Statistical Analysis, Statistical Product, and Statistic: These terms
refer to summary indices no longer containing individually identifiable data 
that may be based on either research data or administrative records. Means, 
standard deviations, correlation coefficients, t ratios, F ratios, probabil- 
ity levels, etc., exemplify statistical products. Frequency counts and per- 
centages usually qualify as statistical products precluding individual identi- 
fication, but not if the identities of individuals can be deduced through as- 
sociation of research data with public records. 
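The caveat about frequency counts, that small cells can let identities be deduced, corresponds to what is now called small-cell suppression. A minimal sketch in modern terms (the threshold of 5 and the response data are invented for illustration, not figures from this report):

```python
# Small-cell suppression sketch: counts below a disclosure threshold are
# withheld, so a published table cannot single out individuals even when
# cross-referenced with public records. The threshold of 5 is an assumed
# convention, not taken from this report.
from collections import Counter

def suppressed_counts(values, threshold=5):
    """Tabulate category frequencies, replacing any count below
    `threshold` with None before release as a statistical product."""
    counts = Counter(values)
    return {cat: (n if n >= threshold else None) for cat, n in counts.items()}

responses = ["yes"] * 40 + ["no"] * 12 + ["refused"] * 2
print(suppressed_counts(responses))
# -> {'yes': 40, 'no': 12, 'refused': None}
```

The two visible counts still sum toward the suppressed cell, so real disclosure-control practice also checks complementary suppression; the sketch shows only the basic rule.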

Administrative Records: Refer to data collected originally for bureau-
cratic purposes rather than research purposes. School grades, achievement 
test scores, earnings subject to withholding tax, unemployment insurance pay- 
ments, days hospitalized, incidence of serum hepatitis, auto insurance claims, 
all represent administrative records that can be of great value in program 
evaluation if they are used in ways safeguarding individual privacy. 

Record, File, Data Bank: These are terms used for collections of data
on individuals, either administrative or research data. 

Reanalysis and Data Analysis by Outsiders: Refer to the use of research
data or administrative records for purposes other than were originally under- 
stood by the respondents, and by persons other than the regular custodians of 
the data. 

File Merging: Refers to combining individual data from two files contain-
ing data about the same respondents, so that one or both of the files, or a 
third file, ends up containing individually identified data originating in 
another file. Unified data banks involve file merging. 

File Linkage: Refers to linking data from two or more files so that
statistical products are generated involving data from both files. File 
merging is the most complete form of file linkage, and where permissible, the 
most statistically efficient. It is important to note, however, that there 
are restricted forms of file linkage that do not involve file merger, and 
where no individually identified data are transferred from any file to any 
other (e.g., the "mutually insulated" file linkage to be discussed below). 
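The mutually insulated form of linkage can be sketched as an exchange in which only name lists travel one way and only aggregates travel back. All names, variables, and figures below are invented for illustration:

```python
# Mutually insulated file linkage, sketched: the researcher sends the record
# custodian only lists of names (batches); the custodian returns only a
# batch-level statistic. No individually identified datum crosses between
# the two files. All data here are fabricated for illustration.
from statistics import mean

# Researcher's file: who received the experimental program (1) or not (0).
research_file = {"Adams": 1, "Baker": 1, "Clark": 0, "Davis": 0}

# Custodian's administrative records, e.g. monthly earnings.
admin_records = {"Adams": 310, "Baker": 290, "Clark": 150, "Davis": 170}

def batch_statistic(batch, records):
    """Custodian side: receives only a list of names, releases only a mean."""
    return mean(records[name] for name in batch)

# Researcher side: form batches by treatment status, ship the name lists only.
treated = [name for name, t in research_file.items() if t == 1]
control = [name for name, t in research_file.items() if t == 0]
print(batch_statistic(treated, admin_records))  # treated-group mean earnings
print(batch_statistic(control, admin_records))  # control-group mean earnings
```

In the fuller procedure the batches are composed and kept large enough that neither custodian can attach an individual value to a name; the sketch shows only the division of labor between the two files.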

Recommendations:

1. Review and Review Boards 

Let us start with a concrete recommendation: 






1a. Evaluation research, social indicator research, social
survey research, secondary analysis of research data, and statistical 
analysis of data from administrative records, are to conform to rights 
of subjects legislation (in particular, PL93-348) and to the guidelines 
and regulations developed to implement these laws by the National 
Commission for the 'Protection of Human Subjects of Biomedical and 
Behavioral Research. This coverage includes all such research regard- 
less of auspices or funding: private, unfunded, university-related,
profit and nonprofit research groups, research by governmental 
employees, etc. 

There is general agreement that these areas of research are and should 
be covered by PL93-348 and other rights-of-subjects legislation. Probably 
99% of such research already is conforming to such standards in the sense of 
not violating the rights-of-subjects specified. There are essentially no 
publicized cases of violations in these areas. The problem raised by PL93-348 
is the monstrous bureaucratic burden of requiring this vast area of low-risk 
research to go through formal institutional review processes. (See the two Ap- 
pendices that present reactions to an earlier draft of this report.) In re- 
sponse to this problem, we are suggesting a process of conditional clearance 
by affidavit. This procedure provides an expeditious means of reviewing
certain low-risk research areas. Sample verification, such as is done for 
income tax reports, and the threat of subsequent prosecution for actions in 
violation of the clearance affidavit should discourage abuses. The suggested 
procedure will be superior to the kind of mass-produced perfunctory clearance 
that Institutional Review Boards would tend to employ in these areas. If
affidavit clearance requires a revision of PL93-348, or other laws, we recom- 
mend such revisions be enacted. 

1b. Rights of Subjects Clearance Procedures: Conditional Clearance
by Affidavit and Full Review by Institutional Review Boards. Before
soliciting funding or initiating a research activity in the low-risk 
areas of evaluation research, social experimentation, social indicator 
research, social survey research, secondary analysis of research data, 
or statistical analysis of data from administrative records, the 
Principal Investigator(s) should file with the Institutional Review
Board concerned with protecting the rights of the participants in the 
planned study, a full research proposal and a "clearance affidavit," 
constituting a detailed affirmation that the rights of the participants 
and subjects are not jeopardized in any of the ways specified by the 
National Commission for the Protection of Human Subjects of Biomedical 
and Behavioral Research in implementing PL93-348. At the discretion of 
the Review Board and the request of the Principal Investigator, this 
affidavit may constitute a conditional rights-of-subjects clearance, 
permitting funding requests and research to proceed forthwith, unless 
or until the Principal Investigator, the Institutional Review Board, 
or the funding source, requests delay for a full review by the Insti- 
tutional Review Board. The Institutional Review Board may conduct such 
a full review at any time during a research proceeding under conditional 
affidavit clearance, and may order the cessation of research found to be
violating rights-of-subjects regulations. 



We envisage this conditional clearance by affidavit, for these low-risk
areas of research, being implemented through a long, detailed questionnaire
that the Principal Investigator(s) would fill out, sign, and have notarized.
The contents of the questionnaire would be based on the rules, issues,
and guidelines that the National Commission for the Protection of Human
Subjects of Biomedical and Behavioral Research is now developing, including
regulations
such as those suggested below. These affidavits and research proposals would 
be kept on file by the Review Board for the length of the research project and 
the subsequent period of project liability for participant injury. For these 
designated low-risk areas, the funding and/or research process could proceed 
as soon as the proposal and clearance affidavit were filed, if the Principal
Investigator(s) had affirmed it as lacking in participant jeopardies and did
not wish a Board review. The Board would have the right to examine these on 
a spot-check, sample, or systematic basis, and to request at any point the 
cessation of activity (funding applications, data collection, data analysis, 
etc.) until a Board clearance had been achieved. For these low-risk areas 
such a delayed decision to hold full review or a veto of the research would 
be rare, and it would be upon such an estimate and understanding of the regu- 
lations that a principal investigator would opt for conditional affidavit clear- 
ance rather than requesting a full Board review. Certainly a Board would want 
to have a staff or Board member examine each affidavit for combinations of
features that might indicate possible risks. Since sampling is an efficient 
technique for quality control, perhaps a Board should give full review to a 
random one-tenth of conditional affidavit clearances. 
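The one-in-ten quality-control sample suggested above could be drawn mechanically. A sketch (the identifier format is invented; the fraction is the paper's own suggestion):

```python
# Sketch: drawing a random one-tenth of conditional affidavit clearances
# for full Board review, as a spot-check on the self-clearance procedure.
# The "AFF-###" identifier format is invented for illustration.
import random

def sample_for_full_review(affidavit_ids, fraction=0.1, seed=None):
    """Return a random subset of roughly `fraction` of the filed affidavits."""
    rng = random.Random(seed)
    k = max(1, round(len(affidavit_ids) * fraction))
    return rng.sample(affidavit_ids, k)

filed = [f"AFF-{i:03d}" for i in range(1, 101)]
chosen = sample_for_full_review(filed, seed=0)
print(len(chosen))  # 10 of the 100 filings selected for full review
```

Because every filing has the same chance of selection, an investigator opting for affidavit clearance cannot predict whether a given project will face full review, which is what gives the sample its deterrent value.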

From the investigator's point of view, affidavit clearance prolongs
the project's vulnerability to a negative Review Board decision and may in-
crease its liability to legal damage claims brought against it by participants.
The relative advantage of prior Board clearance may easily be overestimated,
however. Even for projects they have approved, Review Boards will want the
right to determine that the project is restricting itself to the approved
activities, and will use that right if they receive complaints.

Consideration should be given to the effects of including program evalua- 
tion, etc., on the constitution of Review Boards. This raises a number of 
problems that were not fully presented in the initial draft of this paper and
thus have not received comments. One recommendation is obvious:

1c. Rights-of-Subjects Review Boards should be available to
handle program evaluation, etc., on research done by independent
investigators, profit and nonprofit research organizations, governmental
agencies, etc., as well as for research conducted through universities.

Note that while the Statistical Policy Division of the Office of 
Management and Budget reviews questionnaire forms for governmental and govern- 
ment contract research, and may consider respondents' rights in the process, 
this does not necessarily provide the equivalent of Institutional Review Board 
clearance. 

The proper location of these Review Boards becomes a problem. It would 
be desirable for them to be locally available to the research participants so 
that complaints can easily be placed and heard. This role for Review Boards 
becomes particularly important in monitoring the conditional affidavit clear- 
ance procedure.






To date, Institutional Review Boards have been set up in the institutions 
doing the research. Since most of this research has been conducted in univers- 
ities and hospitals, the participants in such research have had easy access to 
the Board. However, a program evaluation may be conducted by a more distant 
institution. Thus local institutions (such as public schools) whose members 
are frequent subjects of evaluation research may wish to set up their own 
Rights-of-Subjects Review Boards. 

Local Review Boards seem impractical for broad public opinion surveys. 
While city, county, and state boards are conceivable, and should be given juris- 
diction if they request it (local jurisdictions that require licensing of opin- 
ion survey interviewers could insist on approval by Review Boards), it would be
unreasonable to require local Review Boards for national surveys interviewing 
only a few people in any one local jurisdiction. For these, a national Review 
Board is necessary. 

Enforcement of the review requirement will be most effective when tied to 
funding. This suggests that each major source of funding, government and pri- 
vate, set up review boards. While some commercial and private political opin- 
ion research may avoid review, this may be the practical limit of the enforce- 
ment power. Opinion survey interviewing merges into investigative interviewing 
by journalists, detective work, credit investigation, neighborly curiosity, and 
intelligence activities more generally. It is in these areas that Rights-of- 
Subjects are in the most jeopardy (persons interviewed about as well as persons 
interviewed), yet we are unlikely to see such "research" activities subject to
Rights-of-Subjects scrutiny. 

1d. Where there are several appropriate Institutional Review Boards,
one review is sufficient if the Review Board most directly
responsible for the well-being of the respondents does the review or
formally concurs in the review.

Research by a university team on hospital patients would provide one 
example. In such a case, the hospital has the primary responsibility for
the well-being of the participants. If a community drug abuse abatement agency 
required data from high school students to be collected through the schools, 
and if the school district had a Review Board, it would be the one with the 
primary responsibility for protecting respondent rights. 

To adequately protect research participants' rights, it would seem es- 
sential for the participants to know the extent of their rights and where to 
complain if they feel their rights are in jeopardy. Fully informed research 
participants will be necessary to monitoring the conduct of research approved 
under the conditional clearance procedure: 

1e. Research participants should each be given a printed statement
informing them that the research is being conducted in conformity with 
Congressional legislation on the rights-of-subjects, the extent of their 
rights under this legislation, and providing the address and telephone 
number of the Review Board to whom complaints should be directed. 

In the case of a national Review Board, this might include a Toll-
Free 800 area code number. This recommendation is one of several that could
be implemented with a statement in writing that could be left with the respon- 
dent. 




Does the inclusion of program evaluation, survey research, etc., have any 
special implications for the selection of Review Board members? A recommenda- 
tion characteristic of these areas of research would be that Review Boards 
contain members of the groups from which participants are being drawn, or, in 
the case of children, parents of such participants. Such suggestions arise 
out of experience with ghetto neighborhood boycotts of survey research. It is 
probably generally true that on these research topics potential participants 
are more competent to judge when their own interests are threatened than in the 
case of medical research. A brief training program could supply what tech- 
nical knowledge would be necessary to make an informed judgment. While we 
concur in the desirability of having such persons on Boards, along with sub- 
stantial proportions of nonresearchers, we have been unable to develop a recom-
mendation that would insure such representation and still be feasible. It is 
difficult to develop a method that would insure representation of the interests 
of the members of the community while limiting the intrusion of narrow politi- 
cal issues into the review process. If such community representatives were 
given veto power, this would in effect recognize class or category rights, 
which is recommended against in section 7. 

2. The Borderline Between Administrative Reports on Social Service Delivery 
and Program Evaluation 

There is a problem of borderlines between a social work department de- 
livering its regular services and a similar department testing out new pro- 
cedures or giving a special evaluation to its standard method of operation.
Similarly, there is a borderline between the regular instructional activities 
in a school and the comparative evaluation of alternative practices. Thus 
parallels exist to the troublesome problem the Commission faces with regard 
to medical practice: When does the doctor's exploration of alternative thera- 
pies with his patient become research? While the Commission should take some 
cognizance of the borderlines for program evaluation, these problems seem less 
serious than those in medical research, and it is probably wise to employ a 
narrow definition of program evaluation to minimize the coverage. (For 
cautions and dissents on this, as related to specific recommendations to
follow, see Appendix A, reactions to points 5-8.)

Social service programs, employment offices, adult education programs, 
schools, police departments, administrative agencies of all kinds, have in 
the past had wide latitude in varying their modes of operation. It would seem 
unwise to add regulations curtailing this freedom, or adding to the bureau- 
cratic difficulties of initiating change. Thus it might be necessary to 
distinguish between variations in the services and variations in the infor- 
mation collection activities: 

2a. Changes in mode of operation of a service agency that are 
within the legal or customary latitude for change enjoyed by the 
agency will not be interpreted as research under the purview of the 
Commission and related statutes, except with regard to any novel data 
collection activities initiated for the purpose of evaluating the 
change as a program alternative capable of being adopted by other 
similar units. 

There is an ambiguous borderline between information collected for use 




in an annual report of an operating agency and that collected for a program 
evaluation done by an in-house staff. Clearly it would be unwise to include 
annual reports or even special-topic operational analyses done to monitor 
regular operations : 

2b. Data collection and analysis done by an institution for opera- 
tional monitoring of its own operations (as opposed to evaluating pro- 
gram alternatives as policy items capable of being disseminated to other 
units) will not be regarded as research for the purposes of this 
Commission and the related laws. 

These proposed regulations have obvious ambiguities, but rather than sug- 
gest specific refinements, it seems better to wait, allowing operating agencies 
to define their activities as they choose until specific problems emerge. We 
must remember that there are Rights-of-Participants issues in every social in- 
stitution and profession, public and private, whether doing research or not, 
and this Commission must avoid taking on this whole responsibility. 

Expressed purpose in the funding of programs may provide guidelines: 

2c. Where funds are specifically designated for evaluation of 
program effectiveness, construction of social indicators, statistical
analyses of administrative data, etc., the activities undertaken with 
these funds are "research" that should receive clearance as to protec- 
tion of rights-of-participants in research from an Institutional 
Review Board. 

This proposed regulation does not cover the treatment involved (although 
2d below does) but merely the data collection introduced for the evaluation. 
Such an emphasis contrasts with medical therapies, where the dangers of the 
treatment are usually the major concern of an Institutional Review Board. 

Consider a borderline case like "Title I" programs of compensatory educa- 
tion in public schools. In this massive national program, all districts and 
schools meeting specified poverty criteria are eligible to receive funds to 
spend on a variety of special remedial activities of their own devising or 
choosing, but limited to children designated as educationally deficient. While 
a great diversity of innovative and traditional remedial activities are involv- 
ed, these are still within the range of standard operating procedures, and the 
program is funded as a nationwide activity, not a pilot program. However, 
where Congress and the Office of Education fund scientific evaluations of the 
effectiveness of a sample of Title I programs, employing new data collection 
activities, opinion surveys of parents, students, and school personnel, spec- 
ifically administered achievement tests, etc., these latter are judged "research" 
for present purposes. 

There are, however, instances in which the treatment as well as the 
informational research procedures should be reviewed. 

2d. Where the enabling legislation specifies a trial or 
experimental pilot program or demonstration project as well as an 
evaluation budget; where the research contract or grant funding covers 
funds for treatment development and treatment delivery as well as 

for evaluative information collection, Institutional Review Boards 
should review the treatment as well as the informational research 
activities of the project. 

Usually the contract RFP's (Requests for Proposals) and grant applica- 
tions will provide adequate grounds for determining this. While the il- 
lustrations have involved governmental programs, privately supported programs 
also come within the scope of the recommendations. 

3. Informed Consent - General 

The blanket inclusion of "behavioral research" in PL 93-348 may make par-
ticularly marked changes in extending the concept of informed consent from 
laboratory research into areas such as program evaluation and survey research. 
These effects may be so marked as to result in considerable opposition from 
the research community. However, the principle is so obviously fair that we 
recommend the endorsement of this extension. 

3a. Individually identifiable participants in social research, 
surveys, program evaluation, etc., must be informed: 

3a-l. that research is being conducted; 

3a-2. of the procedures they will be experiencing; 

3a-3. of the risks and benefits reasonably to be expected; 

3a-4. of the purpose of the research; 

3a-5. of the anticipated uses of the information; 

3a-6. of the names, addresses, and telephone numbers of the researchers;

3a-7. of the names, addresses, and telephone numbers of the sponsors of the research;

3a-8. that they are free to ask questions and may refuse to participate; and,

3a-9. that they may later withdraw from the research, and the consequences of such withdrawal (cancellation of income subsidies, etc.).

3b. The exact wording of these statements must be approved by the 
Rights-of-Subjects Review Board. The Board may approve
modifications of the elements of the informed consent agree- 
ment when: 

3b-l. the risk to a subject is minimal; 

3b-2. rigid adherence to the specified elements of the informed 
consent agreement undermines important objectives of 
the research; and 

3b-3. any reasonable alternative means for attaining these objectives would be less advantageous to the participants in the research.

The elements of this informed consent agreement are similar to the cur- 
rent HEW informed consent regulation used predominantly in biomedical and 
clinical psychological research. (For a discussion of the problems raised 
when the current HEW regulations are extended to social research, see the 
position paper written for the National Commission by Richard A. Tropp.) 

However, certain elements have been added to accommodate special problems 
that arise in the context of surveys, program evaluation, etc. 

Informed consent must be obtained only from "individually identifiable
participants" in social research. This limitation results in a fairly nar- 
row definition of "subject at risk" as the term is used in the current HEW 
regulations. For example, restriction of the informed consent requirements 
to "participants" in the research will not require the researcher to obtain 
the consent of nonparticipants who might be affected by the treatment, such 
as landlords in a housing allowance experiment. Restriction of the require- 
ment to "individually identifiable" participants would exempt anonymous obser- 
vational studies, etc., in which no jeopardy to the rights of the individual 
participants exists. In rare instances this narrow definition of "subjects 
at risk" may be inadequate, such as in research based on hearsay information 
concerning identifiable individuals. In such rare situations, as in instances 
of anonymous participants and nonparticipants who may be affected by the re- 
search, the broad representation of interests on the Rights-of-Subjects Board 
should insure that the rights of those whose consent is not required will be 
respected. 

Even with this narrow definition of "subjects at risk," major changes in 
the conduct of social research would result. Social researchers will be ex- 
plicitly required to obtain some kind of informed consent of participants. 
Opinion surveys would be required to identify the sponsors and purposes of 
the survey, as well as the research firm conducting the survey. (Note that 
the requirements of information regarding the sponsor's identity (3a-7) and 
the purpose of the research (3a-4) in the previous draft failed to receive 
the endorsement of the majority of commentators. Appendix A, Recommendation 
24.) 

In keeping with the recommendations of Section 5 below, the statements of 
the purpose of the research (3a-4) may stop short of telling the participants 
of experimental treatments that they are not receiving. Even so, such infor- 
mation may influence the degree of cooperation by participants, and, even more 
likely, modify the responses given. It is this latter effect that will most 
disturb the social research profession. However, data collected under these 
conditions can be almost as useful as present surveys. It is comparative 
differences under common contexts that are most informative. Present surveys 
do not provide "absolute" opinions, but rather opinions conditioned by a 
heterogeneous set of respondents' surmises and suspicions on the very issues
that this recommendation would make explicit. Of course, the more explicit 
nature of this information may result in greater attention by respondents to 
these issues, and researchers should anticipate the resulting biases. 

In major experiments such as the New Jersey Negative Income Tax 
Experiments, participants are asked to sign a written consent form. Such 
formality is usually missing from survey research, even in panel studies where 
repeated interviews are envisioned. This recommendation anticipates that in 
most instances, the written consent of the participant must be obtained. In 
situations such as in telephone surveys, where it would be difficult or awkward
to obtain written consent, some other means of obtaining consents will be per- 
mitted. However, researchers must always bear the burden of showing that the 
individual was properly informed and consented to participation in the research, 
and therefore may wish to require a signed consent form for their own protection. 




It has been suggested (see Appendix A, page 8) that separate consents
be solicited for the experimental treatment and information collection compon- 
ents of social research. Such separation can improve the control and estima- 
tion of attrition bias (Riecken, et al., 1974, 58-59). For the most part, in 
program evaluation, social indicators research, etc., and for control groups 
in experiments, only informational consent forms will be required. 

Recommendation 3b permits the Rights-of-Subjects Review Board to modify
the elements of the informed consent requirements when the risks to the sub- 
jects are minor and information regarding one or more of the elements of inform- 
ed consent would undermine some important research objective. This recom- 
mendation is similar to the modification clause in the HEW regulations, and 
permits the flexibility to accommodate a wide range of social research settings.
In certain extreme instances, such as assessment of the impact of Title I funding,
consent of the participants in the research (consent by the parents of the
school children) may be waived by the Rights-of-Subjects Review Board. Such
a waiver would be appropriate when an institution rather than an individual 
is the focus of the study. In such a situation a similar informed consent 
can be obtained from an institution representing the interests of the parti- 
cipants (such as a school board or local governmental body).

Some issues of informed consent in social research are left open by this 
recommendation. It does not address the problems of gaining consent from 
special or institutionalized participants (children, prisoners, mental patients). 
These topics are discussed in other papers submitted to the National Commission. 

These proposals on informed consent have not been reviewed in their present 
form by our cooperating readers, and should be regarded with more caution than 
the better-tested sections of this paper. Moreover, insofar as the content of 
these recommendations was covered (Appendix A, Recommendation 24) no favorable 
consensus was found. 

4. Rights and Interests of Respondents in Informational Surveys 

A major part of social and behavioral research involves soliciting 
information from and about respondents by interviews and questionnaires. 
Respondents certainly have interests and risks with regard to information 
about themselves that they have provided. Their interests should also be 
recognized in determining the proper uses of any information that they have 
provided if it is used in ways identifying them as the source. They also 
have rights over information that others have provided in which they may 
have been identified. (It will be argued below that they have no rights 
that are jeopardized in transfers and uses of such data in which their 
identification as a source or target is precluded.) 

The rights of subjects in survey research, polling, and interviewing
have received relatively little attention compared with the attention these
issues have received in other areas of research and record systems. While
this overview will touch on these problems, it is necessarily limited in its 
coverage. If the National Commission agrees that these problems fall within 
its purview, a special paper centering on the opinion survey industry is 
called for. 




The data solicited by interview and questionnaire for program evaluation
and social indicator development (or for descriptive surveys serving social
science or journalistic purposes) often involve information about illegal
acts. In addition to indicating obvious criminal behavior, information about
income and income sources may indicate violation of tax or welfare laws. Other
sensitive information that could result in personal embarrassment or discom- 
forts to the respondent may be solicited. 

The procedures of survey sampling make the identity of the respondent 
known to the interviewer in door-to-door and telephone surveys. Procedures 
for checking on the honesty and accuracy of interviews through reinterviewing 
a portion of the respondents require recording this identity, as do research 
procedures involving reinterviews of the same respondents (e.g., pretests and 
posttests) or linking respondents to program treatments and other information 
sources. 

Subpoena and Government Audit. The Mercer County prosecutor requested
information about the participants in the New Jersey Negative Income Tax 
Experiment (Watts & Rees, 1973) as a part of a broad search for cases of wel- 
fare cheating. The power of governmental agencies to legally subpoena such 
information creates a real jeopardy to participants in much social research. 
The decennial census and the interim sample surveys conducted by the Bureau of
the Census are made exempt from such subpoena by acts of Congress. Certain 
enabling legislation in drug abuse research has empowered the Secretary of 
HEW to give such immunity to specific research projects. But the New Jersey 
Negative Income Tax Experiment and most program evaluation research lacks such 
protection. In some cases, researchers have gone to jail or risked going 
rather than release confidential information, while in other cases, confidential 
information has been released (Carroll & Knerr, 1976). 

In the Mercer County case, the project and the prosecutor settled out-of- 
court. The project gave the prosecutor names of recipients and amounts of 
money received from the project, but no information on income or anything 
else that respondents had provided the project. The present writers believe 
that this is also the dividing line that any statutes providing privileged 
communication protection for research data should follow. The actions of 
government and of research agencies must be subject to freedom of information 
requirements. The communications of cooperating respondents made for the
purpose of providing research information, however, should be privileged com- 
munications. If law enforcement groups want this information, they can ask 
it of the respondents themselves. Nejelski & Peyser (1975) recommend a broader 
protection, including protection of information about the researcher's actions. 
However, all agree that such a statute should cover the information in all its 
data processing stages, rather than just in the interviewer-interviewee com- 
munication. Such legislation seems unlikely, and the National Commission, in
safeguarding research participants' rights, will have to set standards that
assume subpoena jeopardy.

Required audits of federally sponsored social experiments may result in 
similar threats to the confidentiality of identifiable information. The 
General Accounting Office, pursuant to a request from a Senate Committee con- 
sidering preliminary analyses from the New Jersey Experiment, sought to audit 
and verify interviews. The project staff gave these auditors full access to 
the computer data from interviews with individual identifiers deleted, and the 

GAO produced its own parallel analyses of income guarantee effects. The staff 
also permitted GAO access to a sample of individually identified files to 
audit the accuracy of the transfer from individual files to the record systems 
used in the analysis, which may have been in violation of the project's promise
of confidentiality. Such access was sufficient to meet the purpose of the 
audit without requiring GAO auditors to reinterview the respondents. During 
1975 a similar issue has been raised between the GAO and the Housing Allowance 
Experiment operated by HUD through The Urban Institute, Rand Corporation, and 
Abt and Associates. 

Since, in ordinary public opinion polls, verification by sample reinter- 
view is a standard procedure for checking interviewer honesty and competence, 
it would seem a desirable feature of government auditing of program evaluation 
data. Because such data are assembled as a part of a governmental decision- 
making process, it seems essential that audit, recount, reanalyses, and other 
verification processes be possible. Theoretically it might be possible to ver- 
ify sample surveys by selecting and interviewing independent samples of the 
same size drawn according to the same rules. But since this will rarely be 
feasible, it seems undesirable to preclude verification contacts with the 
original interviewees. It also seems undesirable to violate pledges of con- 
fidentiality to the respondents. Perhaps slight changes in those pledges so 
as to mention the rare possibility of verification interviews to check inter- 
viewer honesty would suffice without reducing respondent cooperation on sen- 
sitive material. If, despite these precautions, the information is so sensi- 
tive that the threat of recontact would substantially impair participation 
in the research, other less intrusive means of establishing response validity 
should be considered (Boruch & Cecil, 1977). 

The possibility of subpoena and of release of names to auditors for re- 
search verification interacts crucially with informed consent. The Institutional
Review Board should examine the specific wordings of the explanation of research 
purpose and pledges of confidentiality made to respondents. Recommended word- 
ings might eventually be prepared. The risks involved will depend upon the 
type of information being requested and degree of cooperation promised by 
local prosecutors and police. 

4a. Where the material solicited involves no obvious jeopardy 
to respondents, a vague, general promise of confidentiality is accept- 
able. E.g., "These interviews will be summarized in group statistics 
so that no one will learn of your individual answers. All interviews 
will be kept confidential. There is a remote chance that you will 
be contacted later to verify the fact that I actually conducted this 
interview and have conducted it completely and honestly." 

4b. Where full and honest answers to the question could jeo- 
pardize a respondent's interests in the case of a subpoena, the re- 
spondent should be so informed. E.g., "These interviews are being 
made to provide average statistical evidence in which individual 
answers will not be identified or identifiable. We will do every- 
thing in our power to keep your answer completely confidential. 



Only if so ordered by Court and Judge would we turn over individu- 
ally identified interviews to any other group or government agency. 
We believe that this is very unlikely to happen, because of the 
assurance of cooperation we have received from ________."



4c. Where the researcher has made the data invulnerable to
subpoena, as by not himself having the key linking names to code
numbers, this key being stored beyond reach of subpoena or in some
agency, like the Census Bureau, immune from subpoena, or where the
researcher has used other procedural or statistical techniques that
insure the anonymity of the sensitive information, the warning of
possible subpoena may be omitted from the background statement to the
respondent.

The devices are discussed more fully elsewhere (see Boruch & Cecil, 1977, 
and Campbell, Boruch, Schwartz, & Steinberg, 1977, for a review of this lit- 
erature). While they have not been tested in the courts, they are probably 
sure enough, and the dangers of subpoena remote enough, so that omitting men- 
tion of the subpoena possibility creates no real jeopardy. In general, as 
shown in the Appendix (reactions to recommendations 9, 10, and 11) our volunteer 
panel were favorable to these recommendations, although vigorous comments were
generated. A strong minority found 4b not protective enough. 
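One classic example of the statistical techniques referred to in recommendation 4c is the randomized response method, in which chance, rather than the respondent's true status, partly determines each answer: no single "yes" incriminates anyone, yet group rates remain estimable. The sketch below is purely illustrative; the function names, the truth-telling probability of one half, and the simulated sample of 100,000 respondents are our own assumptions, not part of any cited procedure.

```python
import random

def randomized_response(truth: bool, p_truth: float = 0.5) -> bool:
    """With probability p_truth the respondent answers honestly;
    otherwise a private coin flip decides the answer, so no single
    'yes' reveals anything about the individual."""
    if random.random() < p_truth:
        return truth
    return random.random() < 0.5  # forced random answer

def estimate_prevalence(answers, p_truth: float = 0.5) -> float:
    """Recover the population rate from the noisy answers using
    P(yes) = p_truth * pi + (1 - p_truth) * 0.5, solved for pi."""
    observed = sum(answers) / len(answers)
    return (observed - (1 - p_truth) * 0.5) / p_truth

# Hypothetical illustration: 20% of a simulated population holds the
# sensitive attribute; the estimate recovers the rate from protected answers.
random.seed(1)
true_rate = 0.20
population = [random.random() < true_rate for _ in range(100_000)]
answers = [randomized_response(t) for t in population]
print(round(estimate_prevalence(answers), 2))  # close to 0.20
```

The respondent's protection comes from the forced-random branch: an interviewer who hears "yes" cannot tell whether it was an honest answer or the coin's.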

Subpoena is probably a rarer threat than accidental release of individual 
information in the form of gossip. Blackmail, though a rare event, should also 
be considered. Thus respondents' rights are involved in the degree to which the 
data processors have access to the data in an individually identified form.
From the COFAER Report (Rivlin, et al., 1975) come these three recommendations 
that the present authors also endorse. 

4d. Sensitive information should not be collected unless it is 
clearly necessary to the evaluation and is to be used. 

4e. Where it is feasible and does not undermine the validity 
of the evaluation, the anonymity of the respondent should be pre- 
served from the beginning by not collecting identifying information 
at all. 

4f . Identifying information, such as name and address or Social 
Security number, should be removed from the individual records at 
the earliest possible stage of analysis and replaced by a code 
number. The key linking this code number to the identifying infor- 
mation should be stored in a safe place and access to it severely 
limited. This key should be destroyed as soon as it is no longer 
needed. 

Even with individual identifiers removed, individual data should 
probably not be stored on time-sharing computer systems, as this makes 
possible a repeated accessing of the data, utilizing variables that are 
a matter of public record, so as to break the code for some specific 
individuals.
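Recommendation 4f can be illustrated with a small sketch. The field names and record layout here are hypothetical, and any production procedure would also need the physical safeguards described above for the key file (restricted access, eventual destruction).

```python
def pseudonymize(records, id_fields=("name", "address", "ssn")):
    """Strip direct identifiers from each record, substituting a code
    number, and return (cleaned records, key).  Per recommendation 4f,
    the key linking code numbers to identities should be stored
    separately under severely limited access and destroyed as soon as
    it is no longer needed."""
    cleaned, key = [], {}
    for i, rec in enumerate(records):
        code = "%06d" % i
        key[code] = {f: rec[f] for f in id_fields if f in rec}
        cleaned.append({"code": code,
                        **{k: v for k, v in rec.items() if k not in id_fields}})
    return cleaned, key

# Hypothetical interview record for illustration only.
interviews = [{"name": "A. Respondent", "address": "1 Main St., Trenton",
               "ssn": "000-00-0000", "income": 4100, "household_size": 4}]
cleaned, key = pseudonymize(interviews)
# `cleaned` goes to the analysts; `key` goes under lock and key.
```

The analysis file then carries only the code number, so a subpoena or accidental release of the working data reveals no identities unless the separately held key is also obtained.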



5. Rights and Interests of Participants in Social Experiments with 
Regard to Treatment Variables.

5a. All participants in an experimental program should be 
informed in advance of all features of the treatment and measurement 
process that they will be experiencing that would subject them to 
any obvious risk or jeopardy and that would be likely to influence 
their decision to participate in the program or their conduct as 
participants in the program. Institutional Review Boards should be 
provided with copies of the statements made to potential participants 
when seeking their consent. 

All experts would probably concur in this recommendation, even though 
there will be many settings in which living up to it will produce less valid 
data than if participants were not informed of certain aspects of the treat- 
ment variable, or kept in ignorance of the fact that an experiment was going 
on. There is a further degree of informed consent, however, that methodolo- 
gists would recommend against. This is the informing of each group of what 
the other groups in the experiment are getting, in particular, informing the 
control group of the desirable treatments the experimental groups are getting. 
The social experimentation committee of the Social Science Research Council discussed
this issue at length, and ended up approving this position, since the interests 
of the control group are not jeopardized and since more complete disclosure 
would have potentially destructive effects on the conduct of the research. 
For example, in the New Jersey Negative Income Tax Experiment, the control 
group members were not informed about the maintenance payments of up to $1000 
or $2000 per year to the experimental group members. As it was, some 26% 
of the control group were lost from the experiment in spite of the $15.00 per 
interview four times a year, while only 17% were lost from the best-paying ex-
perimental group. Envy and resentment, coming from awareness of the relative de-
privation of the control group, would almost certainly have added to this dif-
ferential drop-out rate.

There are cases, to be sure, in which keeping a control group untreat- 
ed and in ignorance of the availability of the treatment being offered the 
experimental group represents major deprivation of rights and harm to well- 
being. The recently publicized experiment on syphilis treatments started in 
the 1930's in the South is a case in point. When started, the informed con-
sent of the participants should have been secured, but the available "cures" 
were so ineffective that the use of a control group restricted to traditional 
treatments was probably not unethical. However, once penicillin became avail-
able, the dramatic (even if only quasi-experimental) evidence of its effective-
ness and its plentiful availability made it immoral to withhold it from the
experimental group. While a parallel situation is extremely unlikely in the 
realm of program evaluation, the possibility should be kept in mind. 

To return to a discussion of informed consent with regard to experimental 
treatments, in the New Jersey Experiment, it was recognized as essential that 
the recipients of the income supports understand clearly that it was for three 
years only. (This has been the source of such serious criticisms about the 
validity of the experiment for purposes of extrapolating to the impact of a 
permanent national program, that in later experiments small groups are getting 
guarantees of up to 20 years.) Were the experiment to be redone today,
the recipients should be warned that information about the payments made by 

the project to them would be released to government officials if requested. 

It should also be remembered that many boons are and should be adopted 
on the basis of a consensus of expert judgment and popular demand. If such 
a consensus is present, quasi-experimental designs not involving equally needy 
control groups may have to be used (Riecken, et al., 1974, Chapter IV). If
the treatment is in short supply, by making quantitatively explicit the degree 
of need and assigning to treatment on this basis, an especially powerful quasi- 
experimental design is made possible (Riecken, loc. cit.). 

5b. Where there is already expert consensus on the value and 
feasibility of a treatment and where there are adequate supplies of 
the treatment available, needy control groups should not be deprived 
of the treatment. 

It should be noted that pilot programs, experimental programs, and demon- 
stration programs do not come under this exclusion. Such testings of potential 
policies should be done so as to optimally learn of the social costs and bene- 
fits of the program, and this will usually require random assignment of par- 
ticipants to experimental and control conditions. If there is expert consensus 
on the costs, benefits, feasibility, etc., then the program could just as well 
be adopted as national policy at once; if controls cannot ethically be deprived 
of the treatment, then usually the pilot program is not worth doing. However, 
if no one is to get the experimental boon unless others equally needy are left 
without it, then the drawing of lots, random assignment, is a traditional equit- 
able method of assigning the boon. In such circumstances, the controls are 
not being deprived in relation to the general population, but only in relation 
to the temporary experimental recipients. (This condition definitely did not 
hold in the syphilis study.) 

6. Reanalysis of Research Data and Statistical Analysis of Administrative 
Records.

Here is an area in which some current interpretations of subjects' rights 
are needlessly hampering useful science. Let us begin by proposing an exclus- 
ionary rule. 

6a. The reuse of research data for reanalysis or for novel 
analyses, and the statistical analysis of administrative records, 
jeopardize no individual rights as long as no individually identi- 
fiable data are transferred out of the file of original custody into 
another file. For uses and reuses meeting this requirement, the 
informed consent of the respondents is not required. 

There are horror stories about Institutional Review Boards requiring
each original subject's permission for the statistical reanalysis of 20-year-
old intelligence test data even though names and other identifying information
had been deleted from the data. Certainly this seems a totally unnecessary 
requirement. The Russell Sage Foundation's guidelines for the maintenance of
school records (Russell Sage Foundation, 1970) suggest parental approval of
each research use of a child's record. Certainly this should be changed to 
read "for each research use involving the release of individually identified 
records." The most recent draft recommendations to the Privacy Protection Study 



Commission suggest that greater access to records for research purposes be per- 
mitted so long as the information is not used to make a determination about
any individual (Notice of Hearings and Draft Recommendations: Research and 
Statistics, January 6, 1977).

As an example of the practice recommended in 6a, data of the New Jersey 
Negative Income Tax Experiment are now available to social scientists through 
the Institute for Research on Poverty, University of Wisconsin. From the 
data have been deleted names, addresses (but not cities), Social Security 
numbers, names of the family doctor, and a few other specifics that might lead 
to identification. 

6b. Individually identified data (research or administrative) 
may be released to new users for statistical analysis only with per- 
mission of the individual described by and originally generating this
data. 

While this rule is consistent with the spirit of the Privacy Act of 1974, 
the draft recommendations of the Privacy Protection Study Commission suggest
that the Privacy Act be amended to permit greater access to identifiable 
research information without the consent of the individual participants. If 
the act is so amended, we would urge that this proposed rule then be rewritten 
to permit much greater access to research information. 

6c. The release of research or administrative data to new users
for statistical analysis when done without the express permission 
of each respondent must be done so as to adequately safeguard all 
individual identities. 

Procedures for achieving this have been described elsewhere (see Boruch 
& Cecil, 1977, and Campbell, Boruch, Schwartz, & Steinberg, 1977, for reviews). 
Usually this would include deletion of the participant's name, address, Social 
Security number, specific birth date (but not year), specific birth place 
(but not geographical region). Where some of the research variables are pub-
licly available and can be associated with identifiable individuals (such as 
lists and descriptions of members of a school or a professional association),
it may also be necessary to delete this information or use crude report cate- 
gories for the variables that are in these accessible lists. Even where 
multiple tables of frequencies or percentages are presented, rather than 
individual-level data, detective work may make possible the discovery of indi-
vidually identified information. Restrictions on minimal cell frequency and
randomized rounding may be required in such cases. 
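The two safeguards just mentioned can be sketched as follows. The minimum cell size of five and rounding base of five are illustrative choices of our own, not values prescribed anywhere in this paper.

```python
import random

def protect_table(counts, min_cell=5, base=5, rng=None):
    """Apply two standard disclosure controls to a frequency table:
    suppress any cell below a minimum frequency, and randomly round
    the remaining counts to a multiple of `base`.  Rounding up with
    probability remainder/base keeps the rounding unbiased."""
    rng = rng or random.Random(0)
    protected = {}
    for cell, n in counts.items():
        if n < min_cell:
            protected[cell] = None          # suppressed cell
        else:
            lower, rem = divmod(n, base)
            protected[cell] = (lower + (rng.random() < rem / base)) * base
    return protected

# Hypothetical published table: the tiny cell is suppressed outright,
# the large one is reported only as a multiple of five.
table = {"district A, welfare recipients": 3,
         "district B, welfare recipients": 47}
print(protect_table(table))
```

A reader of the published table can then recover only approximate group frequencies, never the presence of a specific individual in a small cell.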

6d. The original custodians of research or administrative data 
may generate and release to others statistical products in which 
individual information is not identifiable, including statistical 
products not anticipated by the individuals initially generating the 
data. 

It is anticipated that in the future the requirements of respondent 
confidentiality and of hard-headed meaningful program evaluation will be 
resolved by increasing the data-analysis capabilities of administrative 
record files. Through the "Mutually Insulated File Linkage" (Campbell, 
Boruch, Schwartz, & Steinberg, 1977), the records of two files can be 

statistically linked without exchanging any individually identified data, thus 
conforming to this rule. But this procedure requires that the custodial file 
be able to do standard statistical analyses as well as internal data retrieval 
for individuals. For many ameliorative programs, government records on sub- 
sequent earnings and unemployment compensation would provide accurate and in- 
expensive measures of effects. While these procedures would have their own 
problems, almost certainly they would avoid the differential attrition rate 
found for the interviews in the New Jersey study. Accordingly, it would be 
in the government's interest to increase the internal data retrieval and 
statistical analysis capacities of private health insurance, auto insurance, 
educational testing agencies, hospitals, schools, etc., so that these data 
could be used in program evaluation and social indicator generation in ways 
precluding identifying individual data. 

For many psychological studies in college settings, it would be desir- 
able to statistically correlate laboratory performance and general intel- 
ligence or grade point average from school records. This could be done either 
with individual permission, or through mutually insulated file linkage, in 
which regular registrar staff members were paid to work overtime to retrieve 
the relevant data on specified lists of persons, transform these to means and 
standard deviations by lists, and then return only these summary statistics by 
list. 
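The registrar exchange just described can be given as a minimal sketch. The code numbers and GPA values below are invented for illustration, and a real mutually insulated linkage involves further safeguards; the essential point shown is that only lists cross the boundary in one direction and only summary statistics in the other.

```python
import statistics

# Custodial file: individually identified records that stay with the
# registrar and are never released.
registrar_gpas = {"s01": 3.1, "s02": 2.4, "s03": 3.8,
                  "s04": 2.9, "s05": 3.5, "s06": 2.2}

def summarize_list(ids):
    """Registrar-side routine: retrieve the listed persons' values
    internally and release only the list's summary statistics,
    never an individual record."""
    values = [registrar_gpas[i] for i in ids]
    return {"n": len(values),
            "mean": statistics.mean(values),
            "sd": statistics.stdev(values)}

# Researcher side: subjects grouped by a laboratory variable; only the
# code lists go to the registrar, only summary statistics come back.
high_lab_scorers = ["s01", "s03", "s05"]
low_lab_scorers = ["s02", "s04", "s06"]
print(summarize_list(high_lab_scorers)["mean"] -
      summarize_list(low_lab_scorers)["mean"])
```

The researcher can thus relate laboratory performance to school records at the level of group means without either party ever seeing the other's individually identified data.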

While it is beyond the scope of the National Commission, it should be 
noted that privacy legislation curtailing the use of Social Security numbers 
as all-purpose individual identifiers hinders the uses just described. Greater 
protection of individual privacy can be achieved by prohibiting unified data 
banks. No abuse of privacy has resulted from the limited use of social 
security numbers in research. The prohibition of the use of social security 
numbers for research purposes is a needless and harmful precaution. 

7. Future Controversial Issues.

The above sections have hastily sketched some of the major areas of 
concern that are "timely," in the sense that they are in tune with the con- 
cerns of Congress in setting up the Commission, and also represent to a con- 
siderable degree an emerging consensus among the quantitative social 
scientists engaged in program evaluation and social indicator development. 
(Section 3, Informed Consent, as it affects opinion surveys, may have gone 
beyond this consensus.) 

This present consensus, however, may be seen as but the current form of 
a growing shift in public consciousness about the rights-of-subjects as a 
part of an increasingly equalitarian participatory democracy. It may help 
the Commission to consider what the parallel set of demands 10 years hence 
might contain. The following three topics are included for this purpose. 

Respondents' Interests in the Topics on Which Data are Collected. A 
recent trend in criticism of research on social problems, including evaluation 
research, goes under the name "blaming the victim" (Ryan, 1971; Caplan & 
Nelson, 1973). There is a recurrent option in program evaluation and social 
indicator research as to whether evidence of a social problem is indexed as an 
attribute of the individual or as an attribute of the social setting and the 
social institutions present. When the data are indexed as individual attributes 
(ability, morale, personality, employment status), this predisposes the analysis 
to end up "blaming the victims" of social system malfunction for their lot in 
life. Many times there are options in the wording of questions that can make 
big differences in the social causation implied even while collecting very 
nearly the same data. Standards could be developed requiring that articulate 
spokesmen of the program recipient population be asked to check the research 
instruments in this regard. Or more specific recommendations could be 
developed, such as recommending the social setting attributional format 
wherever the option existed. Shifts of this kind might be of practical value 
as well. In many urban ghetto settings, opinion surveys meet with mass 
boycott, greatly hampering the evaluation of new alternatives in social 
welfare services delivery. In most such instances, the program evaluation 
purposes would be served just as well by substituting "is this service 
effective" questions for the "are you sick" questions. The conceptual shift 
is to turn the welfare recipient into an expert on the quality of welfare 
services delivered rather than a source of evidence about his own 
inadequacies. This shift, plus the one on rights to the results below, will 
almost certainly increase the cooperation received, and turn the 
informational survey into a useful vehicle for communicating neighborhood 
complaints to government. We have not developed a recommendation in this area, 
and the reactions of our panel of readers of the earlier draft (see the 
Appendix, points 21 and 22) show that no consensus exists to support such a 
recommendation. 

Note that the "blaming the victim" theme is only one illustration of such 
respondents' interests. The more general class is discussed in the next section. 

Class or Category, Privacy, Interests, and Rights. This paper and the 
National Commission's activities as a whole have assumed that the rights-of-
subjects are individual rights. Jeopardy to the rights of a class or category 
to which the subject belongs has not been considered. Most discussions 
of rights-of-subjects join us on this. Class rights are a Pandora's box that, 
if given recognition, would totally preclude most social science research. The 
present writers recommend that we continue to refrain from recognizing such 
rights in research ethics, but that we make this decision self-consciously, 
with some recognition of the issues we are neglecting. 

Some examples: The American Council on Education prepared, from anonymous 
surveys of college students, a profile of the activist campus radical who had 
been involved in destruction of property, disruption of speeches, etc. No 
radical respondent was thereby jeopardized for the past acts confessed to, 
since the data were genuinely anonymous in their initial collection by mailed 
ballot. But the interests of current and future radicals are jeopardized. For 
example, college admissions offices seeking to exclude such students could do 
so on an actuarial basis by asking applicants the profile questions about 
backgrounds, interests, activities, and values, and excluding those applicants 
who fit the profile with a large proportion of the predisposing signs. In such 
a case, the proper protection may be to increase the legal accountability of 
college admissions procedures by prohibiting the use of anything but academic 
competence criteria. Rules seeking to preclude such class or category jeopardy 
in research seem to us unacceptable in their likely coverage. 
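The actuarial exclusion described above can be made concrete with a small
sketch. Everything here is hypothetical illustration; the profile items and
threshold are invented for the example and do not come from the ACE survey:

```python
# Hypothetical illustration of actuarial profile matching: an admissions
# office could score applicants against a published "radical" profile and
# exclude those matching many predisposing signs, even though no survey
# respondent was ever individually identified.

RADICAL_PROFILE = {"urban_background", "humanities_major", "high_political_interest"}
THRESHOLD = 2  # exclude applicants matching this many profile signs or more

def fits_profile(applicant_traits, profile=RADICAL_PROFILE, threshold=THRESHOLD):
    """Return True if the applicant matches `threshold` or more profile items."""
    return len(profile & applicant_traits) >= threshold

print(fits_profile({"urban_background", "humanities_major"}))    # 2 signs -> True
print(fits_profile({"rural_background", "engineering_major"}))   # 0 signs -> False
```

The point of the sketch is that the harm operates on the category, not on any
identified individual: anonymity at collection time does not protect future
members of the profiled class.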

The statistical analyses by the Bureau of Internal Revenue might show 
that M.D.'s of certain types have twice the income of other professionals. 
This jeopardizes the interests of these M.D.'s by increasing the frequency 
with which they are approached by fund raisers, confidence men, and burglars, 
and by the invidiously focused zeal of internal revenue agents. Yet such class 
and category social statistics seem to us absolutely essential for the 
governance of a democracy in which past governmental decisions are a major 
determinant of income inequities, even in the free market sector of the economy. 

Black leaders are justifiably disturbed about social statistics reporting 
on invidious black-white comparisons in achievement test scores and crime rates. 
Perhaps even data on income and rental costs could be regarded as prejudicial. 
Yet these data seem essential background evidence on which to base governmental 
action seeking to remove the traditional environmental disadvantages blacks 
live under. The Civil Rights movement has had to reverse itself on this within 
the last 25 years. For example, in 1950 those working to reduce de facto 
segregation in the Chicago schools had as their goal the color-blind assignment 
of children to school districts and setting of school district boundaries. At 
that time open or disguised records indicated the race of every child and 
teacher. Within ten years, the Chicago school system was stonewalling those 
pushing for more integration by asserting that it had no way of telling which 
teachers and pupils were black. To achieve real integration, racial 
identification had to be made known and counted by categories. Affirmative 
action and school integration would be impossible without it. 

At the present time, the no doubt environmentally produced black-white 
difference in school achievement tests has been so redundantly documented, 
and is so regularly misinterpreted as evidence of an innate racial inferiority, 
that one of us has called for a cessation of all such research unless 
accompanied by thorough measurement of the black-white differential in 
opportunities to learn the specific items the tests employ (Campbell & Frey, 
1970). Considering the problem of class or category rights as a whole, however, 
we are reluctant to see any such appeal made a compulsory rule. 

Respondent Rights to Data Produced. It will increasingly be argued in 
the future that the participants in research, the interviewees in public 
opinion surveys, etc., are co-producers of the research product, and should be 
co-owners of that product, with an equal right to know the results and to use 
that information in political arguments and in other ways. This could lead to 
the rule that all respondents to an informational survey should be provided 
with the statistical results produced. Such a rule could be implemented by 
having these results placed in the public library nearest each respondent. 

Another way of arriving at such a proposal is to recognize that where 
such surveys are a part of governmental decision-making, the voting booth 
rather than the animal laboratory becomes the relevant model. Just as voters 
get to know and use the results of elections they have voted in, so too they 
should know the results of surveys and interviews they have participated in. 
This equalitarian emphasis is supported by an analysis that sees researchers 
as a potentially self-serving elite who may exploit the cooperative efforts 
of the respondents by producing products that may be used to harm the 
respondents' interests. While in medical and physical research the results 
might not usually be meaningful and useful to the respondents, for most 
social science surveys they would be. 

The present writers would be happy to have this adopted right now as 
standard operating procedure for all public opinion polls as well as evaluation 
research, including private polls now never published. Along with this would 
go full information, prior to the questioning, as to who was paying for each 
question and how the information would be used. These rules would decrease 
the descriptive value of opinion surveys, in that answers would be more 
consciously given so as to produce politically desired statistical results. 
However, we believe the trends in political conscience are such that in 10 
or 20 years we will have to live with these limitations. (This proposal 
received a bare majority of endorsements from our volunteer panel, as reported 
in the Appendix under Recommendation 24.) 

Summary 

This background paper for the National Commission for the Protection of 
Human Subjects of Biomedical and Behavioral Research asserts that research 
in program evaluation, social experimentation, social indicator research, 
survey research, secondary analysis of research data, and statistical analysis 
of data from administrative records is and should be covered by PL 93-348 and 
other rights-of-subjects legislation. 

Because this vastly increases the burden on existing Review Boards, and 
because actual cases of abuse of subjects' rights are essentially nonexistent 
in this area, a conditional clearance affidavit procedure is suggested that, 
at the discretion of the Review Board, might substitute for full review in 
most cases. Greater numbers and new types of Rights-of-Subjects Review Boards 
will be needed. 

Most jeopardies to rights-of-subjects in these areas will come from the 
information collected about them. On the boundary between research 
and practice, it is recommended that shifts in administrative policy that are 
normally within an administrator's discretion not be regarded as research, 
but that novel data collection procedures designed to evaluate such changes 
be classified as research and made subject to Review Board scrutiny. 

Extending the right of informed consent into these areas, especially 
survey research and other information gathering activities, will require 
major procedural changes that will seem to threaten the validity of results. 
This extension is nonetheless recommended. Informing respondents of the risks 
of verificational interviews and of subpoena of information is recommended 
where these risks exist. 

It is recommended that reanalysis of research data and statistical 
analyses of administrative records be permitted without respondent permission 
where no individually identifiable data are transmitted out of the original 
file of custody. 

In future decades, issues of class rights, of respondents' interests in 
question wording that avoids blaming the victim, and of respondents' co-
ownership of the research results will have to be faced. While the 
Commission's attention is called to these areas, no formal recommendations 
are offered. 

References 

Boruch, R. F., & Cecil, J. S. Methods for assuring confidentiality of social 
research data. In press, Rose Monograph Series. New York: Cambridge 
University Press, 1977. 




Campbell, D. T., Boruch, R. F., Schwartz, R. D., & Steinberg, J. 
Confidentiality-preserving modes of access to files and to interfile 
exchange for useful statistical analysis. Evaluation Quarterly, 1977, 
1(2), 269-300. 

Campbell, D. T., & Frey, P. W. The implications of learning theory for the 
fade-out of gains from compensatory education. In J. Hellmuth (Ed.), 
Compensatory education: A national debate, Vol. 3, Disadvantaged child. 
N.Y.: Brunner/Mazel, 1970, 455-463. 

Caplan, N., & Nelson, S. D. On being useful: The nature and consequences of 
psychological research on social problems. American Psychologist, 1973, 
199-211. 

Carroll, J. D., & Knerr, C. R. Law and the regulation of social science 
research: Confidentiality as a case study. Presented at the Symposium 
on Ethical Issues in Social Science Research, Department of Sociology, 
University of Minnesota, April 9, 1976. 

Nejelski, P., & Peyser, H. A researcher's shield statute: Guarding against 
the compulsory disclosure of research data. Appendix B (86 pp.) in 
Rivlin, et al., 1975 (NRC-NAS). 

Notice of hearings and draft recommendations. Research and Statistics. 
Hearings of the Privacy Protection Study Commission, Washington, D.C.: 
January 6, 1977. 

Riecken, H. W., Boruch, R. F., Campbell, D. T., Caplan, N., Glennan, T. K., 
Pratt, J., Rees, A., & Williams, W. Social experimentation: A method for 
planning and evaluating social intervention . New York: Academic Press, 1974. 

Ritchie-Calder, Lord, Chairman. Does research threaten privacy or does 
privacy threaten research? British Association for the Advancement of 
Science, London, June 1974. 

Rivlin, A., et al. Protecting individual privacy in evaluation research . 
Final Report of the National Research Council Committee on Federal 
Agency Evaluation Research, Washington, D.C.: National Research 
Council, National Academy of Sciences, 1975. 

Rivlin, A. M., & Timpane, P. M. Ethical and legal issues of social experi- 
mentation. Washington, D.C.: Brookings Institution, 1975. 

Russell Sage Foundation. Guidelines for the collection, maintenance and 
dissemination of pupil records. New York: Russell Sage Foundation, 1970. 

Ryan, W. Blaming the victim . New York: Pantheon, 1971. 

Tropp, Richard A. Extending the HEW guidelines to social science research. 
Paper prepared for the symposium: "Ethical Issues in Social Science 
Research." Department of Sociology, Paul Davidson Reynolds, Symposium 
Coordinator, University of Minnesota, 1976. 

Watts, H. W., & Rees, A. (Eds.). Final report of the New Jersey graduated 
work incentive experiment. Vol. I. An overview of the labor supply 
results and of central labor-supply results (700 pp.). Vol. II. 
Studies relating to the validity and generalizability of the results 
(250 pp.). Vol. III. Response with respect to expenditure, health, 
and social behavior and technical notes (300 pp.). Madison, Wisc.: 
Institute for Research on Poverty, University of Wisconsin, 1973. (Duplicated.) 






References (Cont'd.) 

HEW Regulations. "Protection of Human Subjects." Federal Register, Vol. 39, 
No. 105, Pt. 2 (1974), pp. 18914-18920. 

Title II of the National Research Service Award Act of 1974 (PL 93-348). 

* * * * 



Two appendices to this report are available upon request. 

Appendix A (23 pp.) provides a summary of the reactions to the 25 
recommendations contained in the 3 Jan 76 Draft of this report. 

Appendix B (74 pp.) provides the full details of the written comments, 
the names and addresses of those reacting to the 3 Jan 76 Draft, and a 
list of the mailing lists from which the 400 names of those asked to 
comment were drawn. 






13 

RESPONSE TO COMMISSION DUTIES AS DETAILED IN 
P.L. 93-348, SEC. 202(a)(1)(B)(i) 

Donald Gallant, M.D. 



Response to Commission Duties as Detailed in PL 93-348, 
Sec. 202(a)(1)(B)(i) 



Don M. Gallant, M.D. 
Professor of Psychiatry 
Tulane University School of Medicine 



Before considering the boundaries between research and therapy in 
the field of mental health, I should first state that the original charge 
to the National Commission for the Protection of Human Subjects of 
Biomedical and Behavioral Research (NCPHSBBR) in Public Law 93-348, Section 
202(a)(1)(B)(i), totally ignored the reality that the present "accepted 
and routine practice of medicine" is frequently less than adequate in 
many sections of the United States. Thus, the "accepted and routine 
practice" of medicine by some physicians includes techniques that have not 
been scientifically proved in a valid manner and could, therefore, be 
considered research. In many cases, the "accepted and routine practice 
of medicine" deviates from the "intelligent" practice of medicine to 
such an extent that the ignorant physician is actually conducting 
research without the realization that he is utilizing unproved techniques 
in the treatment of his patient. Excellent examples of this situation 
are detailed in an article, "The Prescribed Environment," by Dr. Harry 
Dowling, published in the Saturday Review of April 3, 1971 (pages 
58 through 60). Practices in surgery such as the use of prophylactic 
antibiotics for inguinal hernia operations are still standard practice in 
a number of communities; yet this treatment approach is not based on any 
scientifically valid observations or statistically significant experimental 
results, thus placing this "standard practice" in the area of research. 
This same article refers to a survey of the use of antibiotics in 76 
community hospitals in which a review of 85,000 patients' charts showed 
that only 54 percent of the patients were receiving antibiotics for 
justifiable reasons. Thus, the use of the term "accepted and routine 
practice of medicine" in PL 93-348 is somewhat misleading and makes 
it impossible to separate definitions of research from intelligent 
innovative medical practice or from ignorant medical practice, which 
frequently is "accepted and routine practice of medicine." If this concept 
of "accepted and routine practice" were allowed to prevail, the eventual 
accomplishment would be the least common denominator: a relatively 
uniform standard of mediocre medical practice. Perhaps more appropriate 
terminology might have been "the boundaries between biomedical or 
behavioral research involving human subjects and the competent practice 
of medicine based upon scientifically valid experimentation." 

To reinforce this viewpoint and to show that this is not 
merely a difference in semantics, it should be pointed out that 
"bloodletting" was still included in the "accepted and routine practice of 
medicine" in the early nineteenth century. This procedure was still 
being utilized at that time, more than 50 years after Lind had 
demonstrated the value of controlled experimentation, despite the fact 
that it was supported by no scientifically valid experiments with 
controlled observations. At present, the same lack of scientifically 
valid data applies to classical psychoanalysis, encounter group therapy, 
marathon group therapy, etc. Another example may be seen in surgical 
practice. Until recent years, it was the standard practice in this 
country to use radical 
mastectomy for the treatment of breast cancer. However, as detailed in 
the book Medical Experimentation by Charles Fried (pages 48 and 49), 
radical mastectomy does not result in a higher incidence of therapeutic 
success than simple mastectomy. The use of radical mastectomy in this 
country was not based upon scientifically valid experimentation but was 
considered to be part of the "accepted and routine practice of medicine." 
In research conducted by teams of doctors in Great Britain and Denmark, 
it was concluded that radical mastectomy was no more successful than 
simple mastectomy with respect to recurrence rate or mortality rate. The 
use of the term "accepted and routine practice of medicine" bears the 
connotation of competent and best available techniques. However, the 
above examples demonstrate the inadequacies of certain "accepted and 
routine practices of medicine." 

Definitions 

This section will define the "competent practice of medicine" and 
"research." The definition of the "competent practice of medicine" 
should be based upon the requirement that the therapeutic technique 
has been shown, in a statistically significant manner, to be more 
successful than any type of inert (placebo) therapy approach and that 
the benefits of the treatment technique outweigh the risks. This 
definition of the "competent practice of medicine" enables us to 
differentiate research more clearly from the practice of medicine. The 
intent of all legislation should be to improve the welfare of the 
community. Thus, the framers of this particular piece of legislation are 
obligated to upgrade the practice of medicine if they intend to delineate 
research from the "competent practice of medicine." Present routine or 
accepted practices of medicine that are not based upon scientifically 
proved observations should be allowed to continue temporarily, but 
regulations must be established requiring the evaluation of techniques 
that have never been shown to be significantly superior to an inactive 
or inert type of treatment approach. 

Biomedical or behavioral research involving human subjects should be 
defined as well-designed, critical investigation of therapeutic techniques 
of unknown efficacy and/or risks, or an attempt to find the etiology of a 
disease, having for its aim the discovery of new facts associated with 
the "accepted and routine practice of medicine," with the ultimate goal 
of providing beneficial effects for human subjects. 

A Proposed Method for Delineating "Research" from the "Competent Practice 
of Medicine" Based Upon Scientifically Proved Experiments 

In his paper, Dr. R. Levine raised some important questions about 
specific problems relating to the boundaries between research and the 
practice of medicine. Any question of boundaries could be reviewed by 
a local Extraordinary Treatment Committee (ETC), which would consist of 
legal advisors and physicians not associated with the clinic or 
institution. This type of Extraordinary Treatment Committee was detailed 
in the Wyatt v. Stickney case, 1972. The first level of review would 
be a local treatment review committee; the next level would consist of 
regional appeal boards; the highest appeal authority would be a national 
board of approximately the same composition as the local ones but 
involving persons of national stature, to evolve review standards 
and clarify the questions. Responsibility for establishing the 
guidelines for these independent Extraordinary Treatment Committees (ETC) 
should be assigned to your commission (NCPHSBBR). It is my own personal 
recommendation that, in addition to the scientists, legal consultants, 
etc., an expert in statistics be assigned to each of these committees. 
(At present, we are making the same recommendation in regard to the 
Institutional Review Boards.) Such a committee may be more appropriate 
for review of the problem under consideration than the Professional 
Standards Review Organizations (PSROs). 

For those treatment procedures which are not based upon 
scientifically significant observations, it is particularly essential 
that full informed consent be obtained from the patient. The basic 
elements of this informed consent should be: 

1) An explanation of the procedures to be followed, including an 
identification of those which are not based upon scientifically 
valid observations or statistically significant results and 
thus are experimental; 

2) A description of the attendant discomforts and risks; 

3) A description of the benefits which may be expected; 

4) A disclosure of appropriate and available alternative proce- 
dures that would be advantageous for the patient; 

5) An offer to answer any inquiries concerning the procedures; 




6) An instruction to the patient that he is free to withdraw his 
consent and discontinue the treatment at any time; 

7) The physician has the continuing responsibility to inform the 
patient about any significant new information arising from 
other sources which might affect the patient's choice to 
continue the treatment; 

8) In cases where a patient is mentally incompetent or too young 
to comprehend, informed consent must be obtained from one who 
is legally authorized to consent on behalf of the proposed 
subject. (Of course, this type of permission varies from state 
to state.) However, where the subject is a child who has reached 
the age of some discretion, such as adolescence, or if the patient 
is otherwise mentally competent, the physician should obtain the 
patient's consent in addition to that of the person legally 
authorized to consent on his behalf. 

Since behavioral therapy, psychotherapy, psychoanalysis, and other 
types of verbal and physical techniques (as well as pharmacologic 
medications) may have important consequences for the patient's life, the 
patient should definitely have the opportunity to obtain adequate 
information about the proposed treatment technique and then make his or 
her own judgment whether or not to undergo treatment with a therapeutic 
technique that has not been scientifically proved to be statistically 
superior to an inert technique. The Wyatt case has already established 
this principle with regard to electroconvulsive therapy, aversive 
conditioning, and psychosurgery. The same principles should be applied 
to other 
types of treatment. The real problem arises with the non-medical 
person who does not require licensure in his locality to utilize 
behavioral or verbal techniques with patients. This type of individual 
would not be subject to the authority of the Extraordinary Treatment 
Committee; this important gap and potentially dangerous situation must 
be corrected by the NCPHSBBR. 

In addition to having the opportunity to review and reject a 
treatment program which has not been based upon scientifically valid 
observations, the patient should also have the opportunity to receive a 
new medication or innovative treatment approach if previously available 
scientifically valid techniques have failed. In an opinion rendered by 
the Attorney General of the State of Louisiana (Opinion 74-1675, 1974), 
it is recognized that "patients who are committed to state mental 
hospitals have a constitutional right to receive such individual 
treatment as would give each of them a 'realistic' opportunity to be 
cured or to improve." An Extraordinary Treatment Committee (ETC) should 
be available to give approval to a physician who wants to increase the 
dosage of medication for a "drug-refractory" schizophrenic patient above 
the maximal dosages recommended by the FDA. A readily available ETC 
would be essential for the innovative, intelligent physician who 
understands how to apply a variety of pharmacologic or behavioral 
techniques for the welfare and benefit of the patient. New behavioral 
therapy approaches or innovative types of group encounter techniques 
practiced by either physicians or lay therapists would have to be 
reviewed by the same ETC. Thus, the ETC would require several full-time 
administrative staff members as well as rotating part-time professional 
members, since many of the present techniques utilized in psychotherapy 
and behavioral therapy (as well as in other fields of medical practice) 
lack scientific validity. It would be too difficult to find competent 
professional people in the field of medicine who would be willing to 
serve on a full-time basis on the ETC. 

It should be noted that the literature contains a number of valid 
scientific observations concerning psychotherapy and behavioral therapy. 
One such article, by Sloane et al. (American Journal of Psychiatry 132: 
373-377, 1975), reviewed a controlled evaluation of 94 patients with 
anxiety neurosis or personality disorder who had been randomly assigned 
for 4 months to a waiting list, behavioral therapy, or psychoanalytically 
oriented therapy. At the end of 4 months, the two treatment groups had 
improved equally well and significantly more than those on the waiting 
list. However, one year and two years after the initial assessment, all 
groups, including the waiting list group, were found to be equally and 
significantly improved. Thus, the Extraordinary Treatment Committee as 
well as the Institutional Review Board will have difficult problems in 
evaluating the acceptable duration of treatment as well as the specific 
treatment technique. Theoretically, all treatment techniques, including 
behavioral approaches such as individual therapy, group therapy, 
encounter therapy, etc., should be based on valid, controlled research 
data which show the therapy to be significantly superior to nonspecific 
treatment approaches. There is no doubt that this requirement would 
impose a heavy administrative burden on a local as well as national 
basis, but this approach should eventually result in maintaining a 
competent standard for the practice of medicine, and the requirement 
would help to differentiate more clearly between research and the 
competent practice of medicine, as compared to the subjective attempt to 
understand the physician's "intent" when he uses a scientifically 
unproved technique to treat his patient. 

If one were to accept the legislative assignment to the Commission 
as detailed in Section 202(a)(1)(B)(i), there would be no choice 
but to accept Dr. R. Levine's differentiation between research and the 
"accepted and routine practice of medicine," which relies mainly upon 
intent. From this point of view, it would then be impossible to "read" 
the physician's mind accurately and separate the innovative practitioner 
of medicine from the researcher. A readily accessible Extraordinary 
Treatment Committee would help the innovative physician while stopping 
the incompetent physician from continuing an "accepted or routine 
practice" that has no scientific validity or therapeutic efficacy. 
A specific recent example of the problems in this area can be seen in 
the use of propranolol (Inderal) in the United States. Propranolol was 
approved for use by the FDA for certain types of cardiac conditions but 
had not been approved for use in hypertension. However, hundreds of 
United States physicians, being familiar with the European literature 
describing the efficacy of propranolol in patients who presented with 
high blood pressure, were utilizing propranolol for their patients with 
high blood pressure. When propranolol is used in a sensible manner, it 
can be of definite help to some patients with hypertension, and it is 
also of help to patients who have familial tremor. However, the use of 
propranolol was not an "accepted and routine practice of medicine;" thus 
the inference in PL 93-348 would have been that propranolol was being 
used in a research approach. But this medical technique would not have 
been defined as research by Dr. Levine, who 
recognized that the "intent" was based upon scientifically valid data 
from Europe and that the physician was not experimenting with the 
patient but was using propranolol as a therapeutic tool. A readily 
accessible Extraordinary Treatment Committee (ETC) would have given the 
practicing physician permission to use propranolol as a therapeutic 
agent and would not have required the physician to submit a research 
protocol to the IRB to prove a therapeutic efficacy that had already 
been demonstrated in Europe. Therefore, the ETC should have individuals 
who are experts in the various medical research specialties available 
for ad hoc consultation. The words "available" and "readily accessible" 
are emphasized because these requirements would be absolutely essential 
if new therapeutic techniques are to be made available to patients 
without undue delay. 

However, there is no doubt that a need also exists for this same 
Extraordinary Treatment Committee to eliminate those ineffective medical 
practices or ineffectual psychotherapeutic techniques still considered to 
be "accepted and routine practices of medicine" in some communities. 
Despite all of the available well-designed research studies that show 
the significant efficacy of antipsychotic compounds in schizophrenia, 
there are still some psychiatrists who use only psychotherapy in treat- 
ing those schizophrenic patients, while keeping these patients institu- 
tionalized for long durations of time at great financial costs to the 
families. This type of current medical practice would have to be eval- 
uated by the Extraordinary Treatment Committee. If this new legislation 
is to adequately protect the human subject (patient or research patient 




or volunteer) in biomedical and behavioral research, Section 202(a)(1) 
(B)(i) should be written as follows: "shall consider ... (i) The boun- 
daries between biomedical or behavioral research involving human subjects 
and the competent practice of medicine based upon scientifically valid 
experimentation." As stated previously, those current medical treatment 
techniques that have not been validated by controlled scientific obser- 
vations may be allowed to be continued on a temporary basis. However, 
governmental support of statistical evaluations and comparisons of the 
presently unproved techniques now utilized as "accepted and routine prac- 
tice of medicine" should be immediately initiated. Thus, the government 
would fulfill its obligations to upgrade the standard of medical practice 
as well as to protect the human subject in biomedical and behavioral re- 
search. 

Additional Examples for Caution in the Development and Interpretation of 
Guidelines 

Since research in the field of psychopharmacology is much more ex- 
tensive and more reliable than in the area of behavioral therapy or psycho- 
therapy, I should like to refer to some problems of psychopharmacology 
(which is only another therapeutic tool in the treatment of mental ill- 
ness) that the committee should be aware of in preparing its recommenda- 
tions to the President, the Congress, and the Secretary. In the use of 
antipsychotic medications for schizophrenic patients, there are at least 
six major drug variables which determine the differences in dosage that 
patients require. In fact, these same drug dosage variables apply to 




all oral medications ingested by all of us. 

1) Each of us may react differently to a drug if the setting or 
environment is changed. 

2) Each one of us has a unique interpersonal reaction to the per- 
son administering the drug which may affect our reaction to the 
medication. 

3) The absorption rate of the drug may vary according to whether 
it is dispensed in capsule or tablet form. 

4) Each one of us absorbs at a different rate from the gastroin- 
testinal tract. 

5) Each one of us metabolizes or "burns up" the drug at different 
rates as it passes through the liver. 

6) The end-organ for which the drug is intended (in the case of 
schizophrenia, the brain) requires a different blood concentra- 
tion in each individual. 
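The compounding of these variables can be sketched with the standard steady-state dosing relation (maintenance dose rate = target plasma concentration x clearance / bioavailability). The sketch below is illustrative only; the two patients and all numbers in it are hypothetical, chosen merely to show how modest individual differences in absorption, hepatic metabolism, and end-organ sensitivity multiply into several-fold differences in required dose.

```python
# A minimal sketch (hypothetical numbers) of how the dosage variables compound.
# Standard steady-state relation: dose rate = target concentration * clearance / F,
# where F is oral bioavailability (the fraction of the dose absorbed).

def required_dose_rate(target_conc_mg_per_l, clearance_l_per_h, bioavailability):
    """Oral dose rate (mg/h) needed to maintain a target plasma concentration."""
    return target_conc_mg_per_l * clearance_l_per_h / bioavailability

# Hypothetical Patient A: good absorption, slow metabolizer, ordinary target level.
dose_a = required_dose_rate(10.0, clearance_l_per_h=2.0, bioavailability=0.9)

# Hypothetical Patient B: poorer absorption, fast metabolizer,
# and an end-organ requiring a higher blood concentration.
dose_b = required_dose_rate(15.0, clearance_l_per_h=6.0, bioavailability=0.5)

print(round(dose_a, 1), round(dose_b, 1), round(dose_b / dose_a, 1))
# -> 22.2 180.0 8.1  (an eight-fold difference from plausible individual variation)
```

This parallels the five-fold Dilantin difference the text goes on to cite: no single term in the relation need vary dramatically for the product to vary severalfold.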

Considering these six major variables that affect the response to drugs 
or medications, one can easily understand why one patient might require 
five times the dosage of Dilantin that another patient needs to stop his 
epileptic seizures, and some schizophrenics may require four- or five-fold 
increases in maximal dosages of medication in order to show a therapeutic response. 
Thus, when the FDA approves a maximal recommended dosage, which is then 
printed in the Physicians' Desk Reference, this current "accepted" stan- 
dard guideline may hinder the competent physician who is knowledgeable 
in the area of pharmacodynamics, which considers the above major varia- 
bles in drug metabolism. In the Wyatt case, which has accomplished much 




good, we also see a hindrance of the intelligent physician when we come 
to the court guidelines which utilize the Physicians' Desk Reference for 
maximal dosage. A physician at one of the state hospitals in Alabama 
had to write to the judge responsible for the case as follows: "... the 
alternative to the constraints placed on adequate treatment of an individ- 
ual using the FDA level requires a combination of several different drugs 
up to the prescribed levels in order to achieve the appropriate psychia- 
tric treatment effects for the patient. The latter alternative, while 
somewhat effective, does raise a question as to the appropriateness of 
combining medications to achieve an effect of a single medication with a 
dosage that exceeds the FDA levels. Individual patients have different 
levels of tolerance to medications which makes almost every administra- 
tion and dosage level an individualized one." Thus, this physician had 
been placed in a position of using what we call polypharmacy, which is 
usually bad medical practice; this type of polypharmacy had been inad- 
vertently caused by the guidelines set by the court. Thus, in setting 
guidelines to decrease the mistakes of the incompetent physician, the 
court unfortunately also hindered the knowledgeable physician from using 
this knowledge for the welfare of the patient. However, in the same case 
the court offered helpful guidelines for aversive conditioning which was 
designed to alter aggressive behavior. The court made the final recommen- 
dations that: 

... no patient shall be subjected to any aversive conditioning or 
systematic attempts to alter his behavior by means of painful or 
noxious stimuli except under the following conditions: a) a pro- 
gram of aversive conditioning recommended by a Qualified Mental 
Health Professional trained and experienced in the use of aversive 
conditioning. This recommendation shall be made in writing with 
detailed clinical justification and explanation of which alterna- 
tives and treatments were considered and why they were rejected ... 




b) any program with aversive therapy proposed for the benefit of 
institution patients shall have been reviewed and approved by that 
institution's Human Rights Committee before its use and shall be 
recommended by a Qualified Mental Health Professional for an individ- 
ual patient ... c) the patient has given his expressed and informed 
consent in writing to the administration of aversive conditioning 
... d) no aversive conditioning shall be imposed on any patient 
without the prior approval of the Extraordinary Treatment Commit- 
tee, formed in accordance with this paragraph, whose primary respon- 
sibility it is to determine, after appropriate inquiry and in- 
terview with the patient, whether the patient's consent to such 
therapy is, in fact, knowing, intelligent, and voluntary and 
whether the proposed treatment is in the best interest of the pa- 
tient. The Extraordinary Treatment Committee shall consist of 
five members to be nominated by the Human Rights Committee of 
the hospital and appointed by the Court. The members shall be so se- 
lected that the committee will be competent to deal with the medi- 
cal, psychological, psychiatric, legal, social and ethical issues 
involved in such treatment methods; to this end, at least one mem- 
ber shall be a neurologist or specialist in internal medicine; at 
least one member shall be an attorney acting as the patient advo- 
cate and licensed to practice law in this state. No member shall 
be an officer, employee or agent of the Department of Mental Health; 
nor may any member be otherwise involved in the proposed treatment. 



The court order goes on to state that "no patient shall be subjected to 
an aversive conditioning program which attempts to extinguish or alter 
socially appropriate behavior or to develop new behavior patterns for the 
sole or primary purpose of institutional convenience." Thus, easy 
availability and accessibility of the ETC for the evaluation of the 
aversive conditioning technique would be of essential help in protecting 
the subject. If the aversive technique were based only upon empirical 
observations in other medical reports and not upon scientifically valid, 
controlled studies, it would then be the responsibility of the ETC to 
require that a controlled trial of the specific aversion technique be 
conducted, with the protocol approved by the local Institutional Review 
Board, before the technique is utilized as a standard or routine treat- 
ment procedure. 




Further Explanation of the Recommendation to Change the Wording in Sec - 
tion 202(a)(1)(B)(i) 

I have previously suggested that the consideration for the NCPHSBBR 
should have been "the boundaries between biomedical or behavioral research 
involving human subjects and the competent practice of medicine based upon 
scientifically valid experimentation." The change in the wording has been 
recommended because it helps to differentiate clearly between research and 
what should be "the competent practice of medicine" rather than the "accep- 
ted and routine practice of medicine" which confuses the entire assignment 
given to the NCPHSBBR. Using this change in wording delineates research 
from the practice of medicine and defines the major difference. In addi- 
tion, this wording may be utilized as a guideline that not only protects 
the research patient against the incompetent physician but may also be 
used to help develop and maintain competent treatment methods for patients; 
it may further serve to help the patient understand his particular role in 
relation to the physician who is treating him. There is a thin line in 
many cases between the use of a therapeutic technique or drug for treatment 
and for institutional advantage. Again, the availability of the ETC will 
help to decide individual cases, using the guidelines as stated above. Re- 
search is an exploration of a new technique or medication that has not yet 
been shown to have significant therapeutic efficacy as compared to a cur- 
rently available medical practice or to an inert substance, and the risks 
of this technique or medication are relatively unknown. On the other hand, 
the "competent practice of medicine" should be based upon scientifically 
valid observations that have been detailed in the medical literature. 




It should be remembered, however, that a physician is not bound to 
use one specific therapeutic method or drug for a particular disease. 
The physician has the opportunity and the responsibility to select from 
among all generally accepted modes of therapy as long as there is a 
scientific, logical basis for the determination. Moreover, the physi- 
cian cannot guarantee a cure, but only the exercise of his skill, exper- 
ience, and best judgement for the particular patient. It would be un- 
fortunate if rules to insure rights and benefits became impediments to 
personal care and individualized therapy. However, accountability is 
needed and is proper within the contexts of both research and medical 
practice by even the most conscientious physicians. At the same time, 
too many detailed constrictions based on inadequate scientific evidence 
would tend to move most therapeutic techniques or approaches toward the 
average or the mediocre or toward the "accepted and routine practice of 
medicine" which is not always acceptable at the present time. 

Proposed Guidelines for the "Competent and Routine Practice of Medicine" 

"Competent and accepted routine practice of medicine" should utilize 
medical techniques which have been validated by scientific experimentation. 
In addition, the proper and accepted routine practice of medicine should 
include the following information before initiating treatment: 1) diag- 
nosis , symptom profile, and etiology of the disease; 2) course and his- 
tory of the disease; 3) treatment of choice; 4) anticipated beneficial 
effects and side effects of the treatment technique; 5) alternative 
treatment techniques available for the disease; 6) the physician should 




be knowledgeable about the scientific research results concerning 
the treatment techniques that he is applying to the patient and should 
fully inform the patient about the important aspects concerning the side 
effects as well as beneficial effects of the treatment technique; 7) the 
physician should have some concept of the duration of treatment and this 
aspect should also be explained to the patient; and 8) the patient should 
be informed about what alternative treatments are available, if any, if 
the present treatment technique fails or progresses too slowly. 

Concluding Remarks 

Biomedical or behavioral research involving human subjects has been 
defined as well-designed and critical investigations of a therapeutic 
technique with unknown efficacy and risks, or an attempt to find the eti- 
ology of a disease, having for its aim the discovery of new facts or the 
revision of the present techniques associated with the "accepted and rou- 
tine practice of medicine," with the ultimate goal of providing beneficial 
effects for human subjects. The latter part of the sentence in Section 
202(a)(1)(B)(i) which is worded, "the accepted and routine practice of 
medicine" has been changed in this paper to read, "the competent prac- 
tice of medicine that has been validated by scientifically valid experi- 
mentation." Human research shall not include those studies which exclu- 
sively utilize tissue or fluids or other products after their removal or 
withdrawal from a non-pregnant human being. In this manner, an attempt 
has been made to delineate more clearly the proper practice of medicine 
from the proper conduct of research. The author considers "the accepted 






and routine practice of medicine" in this country as well as in many other 
countries to be unacceptable in certain situations, and there are many 
physicians whose performance does not always meet reasonable criteria 
of quality. The physician in charge of treatment of the patient should 
be using a treatment modality which has been shown in scientific experi- 
ments to have been more efficacious for the specific disease than compar- 
atively inert treatment techniques or substances. In addition, the phy- 
sician should have a reasonable expectation that the treatment imposed 
on patients who have a questionable understanding of informed consent 
(thus, their legally authorized representative signs consent) will pro- 
duce changes that the patient would seek if he were more rational. Any 
question of the efficacy of the treatment technique or treatment goals 
should be reviewed by the Extraordinary Treatment Committee (ETC). In 
those psychiatric emergencies concerned with patients presenting acutely 
suicidal or homicidal behavior, treatment may be immediately instituted 
on admission of the patient to the hospital, but any question of the 
efficacy of the treatment technique or treatment goals should be reviewed 
by the Extraordinary Treatment Committee within a reasonable period of 
time after treatment has been initiated. It should be emphasized that 
the undue delay of treatment may be harmful for the long-term as well as 
short-term prognosis of the patient. Therefore, if the Extraordinary 
Treatment Committee system is to function for the welfare of the patient, 
several of the key members of the ETC will have to be full-time admini- 
strative staff members who are not employees of the institution or clinic 
where the patient is undergoing treatment. Extraordinary Treatment Com- 
mittees should be available for out-patient community facilities as well 




as for institutions. If any treatment technique should lead to serious 
questions as to its safety or efficacy, evidence from the published sci- 
entific literature and from the clinical experience of qualified experts 
should receive substantially greater weight than what is considered to 
be the "accepted and routine practice of medicine" which frequently is 
below the standard that we expect in this country. If the question is 
related to drug use, then the evidence from the scientific literature 
and clinical experience of qualified experts should receive substan- 
tially greater weight than the statements printed in the package in- 
sert and Physicians' Desk Reference (PDR). 

I have referred to Dr. R. Levine's July 14, 1975 paper several times 
and would like to state that I would agree with him on most of the major 
points that he raises in his manuscript if the "accepted and routine prac- 
tice of medicine" were adequate. His conceptual models on page 5 would 
be valid if "routine and standard practice of medicine" were deemed to 
be adequate. However, the proper and competent practice of medicine 
should be based upon scientifically validated experimentation or on em- 
pirical knowledge that the presently used mode of treatment is the best 
available technique for the specific disease at this time. In many cases, 
there is no doubt that one can differentiate the intent of the profes- 
sional researcher from the practicing physician. However, it is my opin- 
ion that there are many exceptions to this observation and that in many 
cases it would be impossible to differentiate the innovative and intelli- 
gent physician who is using a standard medication with a slightly differ- 
ent approach for the benefit of the patient from the researcher who is 
attempting to gain new knowledge from the use of the same medication. 




Similarly, in some situations it may be very difficult to differentiate 
the intent of the incompetent physician who is using "a standard type of 
treatment" in an inappropriate manner from the incompetent research per- 
son who is performing an ill-designed project on an uninformed patient. 
These are some of the reasons why I reworded Section 202 in my attempt 
to delineate research from what should be the "competent" practice of 
medicine. I strongly agree with Dr. Levine's statement on page 14 that 
some physicians may "proceed with pure practice intent" with an inno- 
vative therapeutic approach after other treatment modalities have failed. 
However , according to the definition in this manuscript and according 
to the present regulations, these intelligent, innovative approaches 
are still considered to be research. Thus, I once again must re-empha- 
size the essential need for an Extraordinary Treatment Committee easily 
accessible for a rapid evaluation of this type of innovative treatment 
approach, thus eliminating a great deal of bureaucratic paper work 
for this particular type of practicing physician. Otherwise, under pre- 
sent regulations, he would be forced not only to write out a detailed 
research protocol but to have it evaluated by an Institutional Review 
Board which may only meet once monthly. This delay of treatment could 
be disastrous for the patient. Thus, the patient would be the main 
individual to suffer under the present system when he has the good 
fortune to be treated by an intelligent, innovative physician. 

It has been previously mentioned in this paper that there are many 
people practicing behavioral therapy, psychotherapy, marital counselling, 
encounter therapy, etc., who do not require licensure by the state in 
which they reside, have not received adequate training, and are not 




subject to any legal controls. This situation is ridiculous and must 
be addressed by the NCPHSBBR since these individuals are frequently 
utilizing treatment techniques that are not scientifically grounded 
and are not based upon any scientifically valid experimentation. Thus, 
these individuals are actually performing behavioral research with hu- 
man subjects without any restrictions or controls or guidelines. The 
requirement that such individuals be evaluated by an Extraordinary Treat- 
ment Committee may prove to be of great benefit to a major part of the 
patient population which is now being treated by these individuals. 
There is no doubt that the patient population treated by these unproved 
techniques and unqualified personnel are within the subject population 
that the National Commission for the Protection of Human Subjects of 
Biomedical and Behavioral Research has to report about to the President, 
the Congress, and the Secretary. 

It is apparent that the cost of treatment for mental health will 
increase even more if the Extraordinary Treatment Committees are to be 
effective committees with full-time administrative staff and not just 
rubber stamp committees. However, the possible elimination of ineffec- 
tive and expensive treatments such as psychosurgery and psychoanalysis 
for schizophrenic patients (see May, P.R.A.: Treatment of Schizophrenia: 
A Comparative Study of Five Treatment Methods, Science House, New York, 
1968) may partially or completely compensate for the additional costs. 
Although it is recognized that it would be impossible for the Extra- 
ordinary Treatment Committees to review or even be aware of all treat- 
ment and research problems, the very existence of these committees would 
serve as a deterrent for the negligent therapist or researcher and would 




foster a more cautious, thoughtful attitude in all who are involved in 
research or treatment. 






14 

ON THE USEFULNESS OF INTENT FOR DISTINGUISHING BETWEEN RESEARCH AND 
PRACTICE, AND ITS REPLACEMENT BY SOCIAL CONTINGENCY: 
IMPLICATIONS FOR STANDARD AND INNOVATIVE PROCEDURES, 
COERCION AND INFORMED CONSENT, AND FIDUCIARY 
AND CONTRACTUAL RELATIONS 

Israel Goldiamond, Ph.D. 



Advances in biomedical and behavioral research have aroused public 
concern in at least two areas. These are the social implications of the 
advances and the human means necessary to produce them. The present 
discussion centers on the latter, specifically as it relates to human 
experimental subjects undergoing experimentation and human patients under- 
going treatment. In both cases, there is professional manipulation of 
outcomes, which can contribute to advances. Nevertheless, a commission 
has been established to consider the protection of subjects, rather than 
patients, or both. 

If there are distinctions between the two areas, as is implied by the 
Commission's mandate, then there are at least three reasons to make them 
explicit. First, such distinction is necessary if the scope of delibera- 
tion by the Commission is to be defined. Second, such distinction will 
tend to curtail expansion into one area of controls properly directed at 
the other. In legislative terms, in the absence of clear distinctions, 
rulings directed specifically at, say, experimentation, may come to be 
extended to treatment, and rulings which specifically exclude treatment 
may come to exclude experimentation. Third, if meaningful distinction 
is not possible, there may be repercussions far beyond these, given the 
present social climate. Reports of abuse of human subjects have occasioned 
the present scrutiny of the means for such abuse, which adhere to 
experimentation. If treatment is indistinguishable from experimentation, 
then the same means for abuse are also inherent in treatment. Accordingly, 
whatever social winds sweep at experimentation will also sweep at treatment. 
Indeed, Senate hearings (Hearings before the Subcommittee on Health, 1973) 
on S. 974, training in "implications of advances in biomedical research and 
technology;" on S.J. Res. 71, evaluation of implications of such advances; 
and S. 878, "provision of restrictions on funds for experimental use" are 




published under the title Quality of Health Care -- Human Experimentation 
1973. In addition to not being immune from incorporation into the question- 
ing of research, the routine and accepted practice of medicine is becoming 
routinely less accepted on its own, as suggested by the rising cost of 
malpractice insurance and the increasing scrutiny represented by books such 
as The End of Medicine (Carlson, 1973). 

That the distinction between practice and research is not self-evident 
derives in part from the fact that research is often performed in the context 
of treatment: the person who is a patient may at the same time be a 
subject in a biomedical or behavioral experiment. Indeed, much of the 
research upon which advances in treatment often depend can be conducted only 
under such circumstances. Even when practice and research are separated, 
it seems to be generally accepted by reviewers that treatment is often 
indistinguishable from experimentation. Thus, Beecher states that "whenever 
the physician tries out a new drug or a new technique... he is experi- 
menting in his effort to relieve or cure the individual involved" (1970, 
p. 83) but this is extended to "every medical procedure, no matter how 
simple or accepted," by Ladimer. Treatment "is an experiment since it is 
applied in a new context each time" (1963, p. 190). F. Moore expands this 
into several experiments in the course of one treatment episode: "Every 
surgical operation is an experiment in bacteriology," he states, and is 
simultaneously an experiment "in the pharmacology of anesthetic drugs ... 
in the conformity to anatomical norms, and often in the biology of malignant 
tumors" (1975, p. 15). Levine's overview is indeed apt: "Even a super- 
ficial exploration ... will reveal the impossibility of describing mutually 
exclusive subsets (one called research and one called practice)" (1975a, p. 1). 




In both cases, manipulations derive from systematic approaches; the 
intervention procedures used and the results obtained are recorded; these 
are evaluated in terms of baselines, basal measures, or other norms; the 
interventions are subject to change depending on their outcomes. Other 
similarities exist. Given the social importance of distinguishing the two 
subsets, and given the overlap between observable behaviors, the use of 
a subjective unobservable to distinguish the two is understandable. The 
history of psychology is replete with the introduction of such terms to 
distinguish between processes which it is important to separate, but 
which the verbal-observational system in use does not permit. (As will be 
noted, the history of psychology also reports correctives.) In this case, 
the "taxonomic" function is assigned to intent . Thus, regardless of overlap 
between procedures described, they are classified as treatment where there 
is "therapeutic intent," and as experimentation when the professional's 
"motive is indirect benefit to society, not benefit to the patient" 
(Blumgart, 1969, p. 252). And this holds even if the patient benefits 
thereby; conversely, if the professional "believes (even if only on the 
basis of advertising) that [the treatment] will do the patient good, then 
he is acting as a physician," presumably even if it does him no good (Edsall, 
1969, p. 466). The general opinion is summarized by Levine: "If a 
physician proceeds in his interaction with a patient to bring what he con- 
siders to be the best available techniques and technology to bear on the 
problems of that patient with the intent of doing the most possible good 
for that patient, this may be considered the pure practice of medicine." 
(1975a p. 6). He reports a second system of classification, namely, group 
acceptance or approval, presumably of a particular procedure as treatment. 
The two systems can conflict, as when a physician uses a new drug with the 
intent of doing the most possible good for the patient, while this drug 






has not yet been approved for "safe use" in such cases by a procedure- 
accrediting group -- here, the Food and Drug Administration (1975a, p. 11). 
Intent would then be overridden. In such situations, research would be 
defined by efforts deriving from an intent to distinguish between classes 
of patients for whom a treatment should be approved or disapproved, since 
the intent is to provide generally useful information. Treatment would be 
restricted to the use of the procedure, when approved, with the intent of 
doing the most possible good for a particular patient. 

Undoubtedly, there are differences in intent when research or treat- 
ment is undertaken, and subjects and patients do have different expectations. 
While these differences may be along the lines noted, it would seem that 
intent is a rather slender reed upon which to build public policy, 
especially where issues as important as those noted rest upon this platform. 
That intent is used in its subjective sense is made clear by Levine's 
quotations from the dictionary, e.g., "the state of mind or mental 
attitude with which an act is done" (1975b, p. 2a). The question arises 
of how one ascertains intent or, more properly, ascertains individuals' 
"state of mind or mental attitude" in the performance of their acts, or in their 
"concentra[tion] on some end or purpose" (ibid). The definition of someone's 
intent through consensus by experts is no more valid than such assignment 
by a single person and, ever since Freud, at least, we have learned to 
question even self-assignment of intent, no matter how sincerely or 
tenaciously held. 

Subjective terms such as intent, expectation, desire, motive cluster 
around a common core close to the subjective dictionary definition noted. 
They may be used in several ways, among which are the following. (1) Sub- 
jective : The terms are used with reference to this common cluster. Specifi- 
cally, research and treatment are distinguished by differences in intent and 




expectations (Ladimer, 1963, p. 192). This usage imposes the validational 
difficulty noted, with its attendant problems for social policy. (2) Indi- 
cator : The subjective terms may be considered as indicated by clearly 
stated relations between explicit sets of procedures, called indicators . 
The indicators do not define the subjective processes, which are independent 
of them. Specifically, the different monetary exchanges in research 
(professional pays subject) and treatment (patient pays professional) stem 
from differences in intent; they may indicate the existence of such differences 
but do not define them (Levine, 1975b, p. 8a). Although the indicators may 
be readily defined, the validational difficulty of the referent remains, 
as do the social consequences noted. (3) Operational : Terms with an 
originally subjective meaning may be used as a metaphor or simply as a con- 
venient label for clearly stated relations between explicit sets of procedures, 
which define the terms. Specifically, research intent is defined by certain 
stipulated procedures, and treatment intent by yet others. The terms have 
no other properties. This is the most familiar form of the operational 
definition. It couples clarity and ready validation with what is often the 
exclusion of the area of concern. (4) Operant contingency : The social 
importance attached to subjective distinctions may be considered as repre- 
senting important differences in social and personal consequences which are 
contingent on the behaviors which are occasioned by the systems discussed. 
Specifically, if differences in intent are consistently used to separate 
research and treatment, this may derive from important differences in the 
social and personal consequences contingent on behavior in the two institu- 
tions. 

Overlap between many of the behaviors in the systems necessitates the 
introduction of a classification system other than behavior. This can be 




intent which, unfortunately, leads to validational problems, since it is 
unobservable. However, the alternative classification system can also be 
the operant (as opposed to operational; cf. J. Moore, 1975) contingency, 
which does not define terms simply by the behaviors, but also by their 
relation to the consequences differently contingent on them in the two 
settings. These, too, are observable and can be validated. They fulfill 
the same logical necessity to which subjective intent is addressed, and 
may serve the same social functions. The system of analysis, however, 
is not as familiar as the others, nor has it been used as extensively in 
discussions of social issues. Accordingly, it can not be referred to as 
readily, nor stated as simply. The simplest statement, of course, is 
intent. However, the complexities and difficulties encountered when one 
tries to apply it meaningfully to matters of social policy suggest that 
the verbal simplicity provides little help in systematizing the issues to 
which it is addressed. This drawback is also encountered in subjective 
definitions of consent (i.e., did the person really understand?) and the 
coercion which jeopardizes its legal acceptance. 

This discussion is addressed to the problem of making explicit the 
social and personal contingencies to which terms such as intent, coercion, 
and consent are addressed, in the context of distinguishing research from 
treatment and, therefore, of distinguishing human subjects of biomedical 
and behavioral research from human patients of biomedical and behavioral 
treatment. In the process, I shall note ancillary issues such as the 
different types of contractual relations involved, as well as some 
assumptions on which these are based.

The discussion will open with a brief exposition of the analytic 
system, its commonalities with cognate systems in the social sciences and 
in law. I shall examine a legal use of intent as a taxonomic device to apply 
differential treatments, for the clues it contributes to this discussion. 




I. SOCIAL CONTINGENCIES AND LEGAL INTENT 

The opening discussion of operant contingencies will be confined to 
that which is necessary for the later presentation. 

The "three-term" formulation of an operant contingency requires that 
at least the following elements be specified: (1) the occasions upon which 
(2) consequences are contingent (3) on behavior (cf. Skinner, 1969, p. 7). 
The term contingency refers to the fact that unless the behaviors occur, 
the consequences will not follow. Another way of stating this is that the 
behavior is required (if the consequence is to occur) or is a requirement 
(for its occurrence). The consequence, however, need not follow every 
behavior occurrence: a fixed or variable number of responses, or a period
of no behavior may be required, among others. The event in (1) may be 
said to occasion the behavior or provide the opportunity for it. Presented 
in order of appearance, the contingency is described as (1) occasion, 
(2) behavior, (3) consequence. 

Where, given the occasion-behavior-consequence contingency, the 
behavior increases in likelihood when the appropriate occasion occurs, a 
reinforcement contingency is defined. In positive reinforcement, the
behavior-increasing consequence is the presentation of an event. In 
negative reinforcement, the behavior-increasing consequence is the postponement
(avoidance) or elimination of an event (escape). Where, given occasion-
behavior-consequence relations, the behavior decreases in likelihood,
a punishment contingency is defined. Punishment can involve postponement
or elimination of an event (typically, one whose presentation is positively 
reinforcing), or it can involve presentation of an event (typically the 
events whose withdrawal is negatively reinforcing). 

It will be noted that whether the contingency is defined as reinforcement
or as punishment depends on whether behavior was increased or
attenuated, respectively, and not upon the intent of the wielder. A parent
who intends to stop a child's annoying behavior or to prevent its recurrence, 
and behaves in a manner judged by self and others to be punitive, will be 
defined as having instituted a reinforcement contingency -- if there was 
an ensuing increase in behavior. If the behavior did indeed cease, this 
outcome might then reinforce the parent's punitive behavior on those 
occasions when the child misbehaves. Being punitive is the requirement for 
obtaining relief. 
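The classificatory point may be stated procedurally. The following sketch (a hypothetical illustration in Python; the labels and rates are invented, not drawn from this paper) classifies a contingency by the observed change in behavior likelihood, not by the intent of the person applying the consequence:

```python
# A minimal sketch of the three-term contingency and its classification
# by observed effect. All names and numbers are illustrative only.
from dataclasses import dataclass

@dataclass
class Contingency:
    occasion: str      # (1) the occasion upon which
    behavior: str      # (3) behavior is emitted, and
    consequence: str   # (2) consequences are contingent on it

def classify(rate_before: float, rate_after: float) -> str:
    """Classify a contingency by the observed change in behavior
    likelihood, not by the intent of the person applying it."""
    if rate_after > rate_before:
        return "reinforcement"
    if rate_after < rate_before:
        return "punishment"
    return "neither (no change in likelihood)"

# The parent example: scolding intended as punishment, but the child's
# annoying behavior rises from 3 to 5 occurrences per hour after the
# scolding is introduced, so the contingency is reinforcement,
# whatever the parent intended.
scolding = Contingency(
    occasion="child misbehaves",
    behavior="child's annoying behavior",
    consequence="parental scolding (attention)",
)
print(scolding.consequence, "->", classify(3.0, 5.0))  # reinforcement
```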

One last point will be made. Whether or not presentation of a conse- 
quence will affect behavior will depend on antecedent conditions which must 
be specified. Whether food can reinforce behavior depends on the organism's 
degree of deprivation, upon the cultural definition of that food as permiss- 
ible or forbidden, among others. Further, events may acquire reinforcing or 
punitive properties through their relation to other events. Where the 
behavior required for reinforcement is an extended sequence of interactions
with the environment, each component link in that chain may be considered 
as an occasion-behavior-consequence link. This consequence derives its 
reinforcing property from its progressive relation to that consequence for 
which the whole sequence is required. 

The formulations may be used to analyze social relations, and the pro- 
cedures developed may be used to change them. When one person is engaged 
in extended interaction with another or with a system, the behaviors of each 
may be viewed as occasions and consequences which bracket the behaviors of the other.
Each consequence may derive its reinforcing properties from its relation to 
a consequence at the end of the chain-requirement, or for other reasons. 

The relation can be considered in terms of gains for each. The 
advantage can be considered positive, e.g., obtaining something valued, or 
negative, e.g., obtaining relief from distress. The relationship can be
described in terms borrowed from the market-place: there are transactions
involved, with one person's behavior providing the other with something 
valued, and the other providing something valued in return. In its original 
usage (before its corruption by psychotherapists), transactional analysis 
referred to such relationships, often involving extended verbal intercourse. 
The descriptive metaphor may be a barter system, with exchange theory being
the model. Decision theory may be viewed as a related development. A
decision requires at least two well-defined sets of behavior, which intersect
with at least two states of the environment. A 2 x 2 matrix is
thereby defined, with the entry in each being the consequence of that 
behavior under the particular environmental occasion. All four consequences 
must be considered, in accord with some decision rule, and the analysis 
often consists of ascertaining which decision rule rationalizes the empirical 
data obtained, that is, which provides the best fit. It will be noted that 
where the states of the environment, present or future, are unknown, there 
is risk attached to either behavior, since the consequence may or may not 
be a gain, depending on the state of the environment. Cost-benefit analysis also
considers the consequences which are contingent on behavior, but in contrast 
to the decision model presented, in which either of two consequences is 
contingent on behavior (depending on the occasion), in cost-benefit analysis, 
at least two consequences are often both attached to the same behavior. 
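The 2 x 2 decision matrix just described can be made concrete. The following sketch (hypothetical payoffs, invented for the example) builds such a matrix and applies one possible decision rule, maximin, under which each behavior is evaluated by its worst-case consequence, since the state of the environment is unknown:

```python
# Two behaviors crossed with two states of the environment; each cell
# holds the consequence (payoff) of that behavior under that state.
# The payoffs are invented for illustration.
payoff = {
    ("carry umbrella", "rain"):     5,
    ("carry umbrella", "no rain"): -1,
    ("go without",     "rain"):   -10,
    ("go without",     "no rain"):  2,
}
behaviors = ["carry umbrella", "go without"]
states = ["rain", "no rain"]

def maximin(payoff, behaviors, states):
    """One possible decision rule: choose the behavior whose worst-case
    consequence is best. Since the states are unknown, each behavior
    carries risk; maximin guards against the worst outcome."""
    return max(behaviors, key=lambda b: min(payoff[(b, s)] for s in states))

print(maximin(payoff, behaviors, states))  # carry umbrella
```

Other decision rules (minimax regret, expected value under assumed state probabilities) would consult the same four cells but could select differently; the choice among rules is itself the policy question.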

Each of these models covers overlapping terrain, and also considers 
variables not considered by the others. Differences in metaphors, that is,
the languages they use and the concepts they relate these to, as well as
differences in variables considered, derive from the different requirements
of the academic disciplines, e.g., transactional analysis in anthropology,
exchange analysis in sociology, decision theory in economics, and operant
contingency analysis in the conditioning laboratory, from whose requirements
much of the terminology and procedures derive. Differences in terminology
and metaphors have tended to restrict communication between models. Where 
a model has been applied to a discipline other than its origin, it has often 
led to bursts of progress (e.g., decision theory applied to perception and 
clinical decisions), since it contributes procedures which are new to the 
adopting discipline. Although the language has often been subjective, e.g., 
participants have expectations, they make decisions, they hope or intend 
to optimize net gain, what makes the adoption useful is the procedures for 
analysis it provides. I shall consider the relevance of such procedural 
analysis for analysis of legal intent. 

It would be surprising if the legal system, faced with decisions which 
have social consequences, had not come up with similar procedures. Where 
power over life, liberty, and property is involved, the consequences of 
definitions in terms which are open to a variety of interpretations in 
practice, and in terms which are quite specific and limited, can be markedly 
different. For example, Currie (1968) attributes differences in the number 
of witches executed in Renaissance Europe on the Continent (500,000 estimated 
executions) and in England (less than 200) to differences in the stringency 
of the definitions of witchcraft applied by the different legal systems, 
and to the different consequences of conviction to the accusing system. 
Intent, as noted, is a difficult term to define. I shall consider its legal 
use in mens rea , or criminal intent, specifically with regard to intent to 
commit murder. 

Wexler, a legal scholar, notes that "the law is ripe for contingency 
analysis" (1975, p. 174) and that such analysis "can help to clarify the 
definitional and evidentiary aspects of hazy and imprecise legal concepts" 
(p. 175). He also notes that previous attempts "to purge the 
penal law of the concept of mens rea ('criminal intent') ran head-on into
numerous obstacles and objections" (p. 175). However, as was discussed,
there is a difference between the operational definitions associated with 
classical behaviorism and the operant contingency definitions associated 
with radical behaviorism (Skinner, 1974). 

Two types of contingencies will be noted which are related to the 
statement that someone "did willfully and knowingly intend" to commit 
murder and then carried out his intent. The first contingency to be dis- 
cussed defines the intent which distinguishes first degree murder. The 
second defines the social consequences contingent on differentiation of 
murder by intent and other types of killing. 

1. Intent defined. Three things are involved here: motive, opportunity, 
and means. 

Motive is defined by the consequences of the act. A victim is found 
dead in Trenton with a bullet hole through his head. If it turns out that 
a nephew is bequeathed $50 million as a result, the nephew is considered
as having a motive. The French maxim, "Cherchez la femme" suggests a pre- 
vailing consequence (motive) in that society. 

Opportunity. This is where the alibi enters. If the nephew was in San
Francisco at the time, he is not as likely a suspect as if he had been
in Trenton, in the neighborhood of the crime, at the time; in the latter
case he will be a suspect.

Means. The nephew has recently purchased a carbine, has practiced, 
and the murder bullet was .30 caliber; the nephew reports that the rifle 
had been stolen the week before. 

The nephew is the prime target, and the state will make every effort 
to demonstrate that the means was probable behavior. He may be indicted and, 
despite his strenuous denials, a jury of his peers may find him guilty of 
murder with intent, that is, first degree murder. 




It will be noted that the three-term operant contingency discussed
earlier is considered to be present: opportunity, consequence, behavior.
Intent is thereby defined. 

2. Social necessity. If the uncle is killed in what appears to be a 
traffic accident, and the driver had no motive, the law will treat this 
differently. If, in addition, the driver had exceeded the speed-limit, 
the law will treat this yet differently. If, in addition, the driver was 
fleeing the scene of a robbery he had committed, this will be considered 
the equivalent of first degree murder. To the immediate family, the results 
are operationally the same: they have lost a beloved member of the family. 
He is just as dead in each case, including the murder case. The law will 
not bring him back, yet it treats the killings differently. 

On (a) the occasions of the offenses cited, (b) the consequences for
society (c) of classifying the offenses in actionable categories must be
considered in accord with a particular social policy. Inspection of the
offenses, classes established, and social consequences suggests what the 
policy may be. With regard to the intent-to-kill contingency discussed, 
societies apparently abound with people whose elimination would be useful 
to other people. Societies also abound with earnings which may be obtained 
by theft and other felonious behaviors. Both the temptations and behaviors 
which yield to them are prevalent. In addition, the behaviors are amenable 
to social control. Accordingly, the law intervenes to decrease the likeli- 
hood of these behaviors by threatening its most drastic punishment, and 
applies the general term "first degree" killing. However, to paraphrase 
Laplace's maxim on the improbable, accidents allow themselves the luxury
of occurring. No legal sanctions can prevent them from occurring, so the 
law will not apply its deterrent. A component of social policy may be 
inferred from the discussion, namely, that severity of consequence be
directly proportional to its efficacy in decreasing the likelihood of the
offense. The more effective the punishment on behavior, the more severe it 
should be. Another component of social policy may be inferred from the 
different punishments attached to killing when the speed limit was exceeded 
or when a felony was committed. Both speeding and felonies may be amenable 
to control by social deterrents, but the offenses differ in a variety of ways, 
including prevalence, and the likelihood of general damage to the social 
fabric. The presence of yet a different component is suggested by lex 
talionis (e.g., a life for a life), whereby the severity of the legal con- 
sequence is governed by the general severity of the offense. Here, all types 
of killing might be treated similarly. 

No pretense is made that the discussion is exhaustive; the writer is a 
legal layman. Nevertheless, the two contingencies presented suggest that 
legal resolution, although often couched in subjective terms such as intent 
(coercion and consent will be considered later), is amenable to contingency 
analysis and possibly was formulated in accord. It was noted earlier that 
various social disciplines have almost independently developed forms of 
contingency analysis and there is no reason to assume that this is not the 
case for law. It is of interest that decision theory, a system of complex 
contingency analyses, employs, as does the law, subjective metaphors to 
label its components, e.g., a decision is made, a strategy is followed, 
it may be governed by its expectations. The terms, however, are names 
for explicit procedures and explicit formal (mathematical) relations between 
procedures and data. The bases for classification are the observables 
and their relations, and not the subjective designations given them, nor, for
that matter, the dictionary definitions of the designations.

Nor should it be assumed that the contingencies presented are those 
which actually occur. Only a careful fine-grain analysis of the actual 
workings of each system can indicate what contingencies are actually operating
in that system, as opposed to those which "should be," as defined ethically
or as stipulated by its empowering group or by its own members. The con- 
tingencies presented are purely heuristic, and serve to suggest some necessary 
considerations for social definition. 
Contingencies of classification of social activity.

Assuming that contingencies are employed in classification (where human
behavior is under consideration, such contingency analysis is suggested,
since behavior is sensitive to influence by consequences), the discussion
suggests that at least two social contingencies are required. One is the particular
contingency which defines the class to be treated. The other contingency 
governs the specification of a classificatory scheme, whereby the first 
contingency is distinguished from others in the scheme. 

A variety of classificatory schemes can be proposed, each of which 
can be stated as a contingency. The social policy which affects the choice 
of one rather than the other should be made explicit. A parallel is found in
decision theory where, for the same sets of contingencies, different decision 
criteria or decision goals, are offered (e.g., minimax, maximin, Neyman- 
Pearson criteria) which set different types of outcomes as acceptable, and 
thereby require different policies, or strategies of choices. 

Decision theory may be employed normatively, that is, to suggest 
strategies which accord with the policy, e.g., if average losses are to be 
kept below a certain level (minimax), a specified strategy should be followed. 

Decision theory may also be employed descriptively. For the actual 
choices and their consequences, the question may be raised as to which 
decision criterion best rationalizes the data, that is, which best fits the
data. This postdiction may then be validated by prediction of experimental 
or other research outcomes. It should be noted that it is not necessary 
to assume that the choices were governed by rational intent. Animals have
been excellent subjects for decision research. The decision criterion which
rationalizes the data is the one which makes the most sense to the analyst, 
not the "decision maker". 
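The descriptive use may be sketched as follows (a hypothetical illustration; the problems, payoffs, and observed choices are invented for the example). Each candidate criterion is scored by how many of the observed choices it reproduces, and the best fit is the criterion that rationalizes the data, whatever the chooser intended:

```python
# Which decision criterion best rationalizes a set of observed choices?
# Payoffs per (behavior, state) and the observed choices are invented.
problems = [
    ({("a", "s1"): 1, ("a", "s2"): 1, ("b", "s1"): -5, ("b", "s2"): 10}, "a"),
    ({("a", "s1"): 0, ("a", "s2"): 2, ("b", "s1"): -1, ("b", "s2"): 8},  "a"),
    ({("a", "s1"): 3, ("a", "s2"): 3, ("b", "s1"): 2,  ("b", "s2"): 4},  "a"),
]
states = ["s1", "s2"]

def maximin_choice(p):
    # pick the behavior with the best worst-case payoff
    return max("ab", key=lambda b: min(p[(b, s)] for s in states))

def expected_value_choice(p):
    # equiprobable states are assumed for this sketch
    return max("ab", key=lambda b: sum(p[(b, s)] for s in states))

criteria = {"maximin": maximin_choice, "expected value": expected_value_choice}
fits = {name: sum(rule(p) == choice for p, choice in problems)
        for name, rule in criteria.items()}
best = max(fits, key=fits.get)
print(best, fits)  # the criterion scoring the most matches best fits the data
```

The postdiction could then be checked by prediction: the fitted criterion should also reproduce choices in problems not used for the fit.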

Finally, a discrepancy between socially normative criteria and descrip- 
tively inferred criteria may be used to orient programs of change. Indeed, as 
Gray (1975) concludes, "relatively little consideration has been given to 
mechanisms or procedures that might help assure that the ideals are 
achieved" (p. 245). He notes that an institution may set up peer review 
committees only because consequences such as protection of the institution and 
a continued flow of research funds are contingent on such behavior. Further, 
the very review procedures chosen may be those whose consequences are simply
to "appear to meet the official goal" (1975, p. 46, original emphasis).

Decision theory specifies its requirements, procedures, and outcomes 
in explicit terms which are related mathematically and are often so defined. 
Obviously, all of these can not be met -- what quantity do we assign a human 
right or an iatrogenic dysfunction (even if a jury does)? Nevertheless, it 
may be worthwhile to specify those classes of observations and relations which 
the theory requires, and consider them explicitly, for policy formulation. 

Contingency analysis, as used in decision theory and in operant behavior 
analysis, would appear to be useful in consideration of social issues and 
policy. We shall now consider such definitions of treatment and research. 

II. TREATMENT AND RESEARCH

The first two terms of the three-term contingency which specifically
define treatment and research will be considered together, since (a) the
occasions and (b) the consequences (which will then be contingent on behavior)
are defined in terms of each other in a manner to be noted; the third element,
(c) the behaviors then required, will be considered separately. The
different contingencies for patients and subjects and for their corresponding
professionals will be noted in a separate section which will also consider
the means-ends differences often assumed to distinguish patients from subjects. 

Discussion of the social contingencies and policy which specify a 
particular classificatory scheme will be dispersed throughout and accordingly 
will not be restricted to a separate section. 

Occasions and consequences in the social definitions of treatment and
research.

There are interesting parallels between the occasion-consequences 
relations of the treatment and research systems. These parallels are along 
lines other than patients and subjects. 

In the various treatment systems, the events which occasion treatment 
are individuals (collectives may be considered as such) who present functioning 
which is less than adequate or which poses problems, and the consequences 
which maintain treatment are progress toward, and it is hoped, production 
of functioning which is more adequate than before, for the same individuals. 
The individual units can be humans who are designated as patients going 
through a clinical system, as students through an educational system, as 
trainees through a training system, and so on. The units can be animals going 
through clinical or training systems. The units can also be automobiles or 
electrical appliances going through their repair systems. The transmutations 
in functioning may be designated in terms such as correction, enhancement, 
innovation, limitation, repair, restoration, and treatment, among others. 

In the various research systems, the events which occasion research 
are somewhat systematized and organized statements of related problems, and
the consequences which maintain research are progress toward and, it is hoped, 
better organized statements. The criteria used to evaluate the organization 
include, among other things, changes in consistency, parsimony, coverage 
and, for those empirical systems we call scientific, validation by prediction 
or control. The transmutations along these lines may, like treatment,
be designated as correction, enhancement (extension), innovation, limitation,
repair, restoration, and treatment, among others. 

The changes attributed to the two systems may be described as the 
positive reinforcers of functioning, healthy, or educated individuals in the 
treatment systems and of better-systematized statements or new knowledge 
in the research systems. The changes attributed to the two systems may also 
be described as the negative reinforcers of relief from distress or ignorance. 
Although these consequences, whether viewed "constructionally" or "patho-
logically" (Goldiamond, 1974), are not always produced by the social
institutions (n.b., school ineffectiveness), they are considered to be
contingent upon their proper functioning, and the consequences (no matter 
how variable) therefore maintain social support of the institutions. The 
support can be financial, as in research, or partly financial and partly also 
in the granting of virtual state monopoly, as in the school systems and 
medical licensing systems. 

This cursory analysis suggests that in the clinical treatment enter- 
prise and in the biomedical-behavioral research enterprise, the patient and 
the systematic formulation ("Nature") are analogous. The human patient and 
the human research subject are not analogous in considerations of the two 
enterprises as enterprises. 
Behaviors in the contingencies defining treatment and research.

Whereas the differences between occasions-consequences in treatment
and occasions-consequences in research seem clear, there is considerable
confusion in the literature on differences between the third terms of the 
contingency, namely, behavior. As was noted in the introduction, "every 
medical procedure, no matter how simple or accepted" is considered to be 
"an experiment since it is applied in a new context each time" (Ladimer, 
1963, p. 190). Since the outcome is never certain, "all or nearly all therapy
is experimental" in this sense (Beecher, 1970, p. 94; cf. Freund, 1969, p. viii).

Where there is uncertainty of outcome, the effort must be considered 
as a trial or as an attempt whose outcome is to be related to the trial 
to produce a type of knowledge or inference which is never certain, 
is fallible, and is therefore subject to change. When one contrasts the 
certainty of the a priori knowledge which derives from faith, the classical 
distinction between the a posteriori knowledge derived from experience and that 
derived from faith is evident. Indeed, the French word for experiment is
expérience, defined in my Larousse Petit dictionnaire (1936) as "n.f. Essai,
épreuve. Connaissance acquise par la pratique, par l'observation" (trial,
test; knowledge acquired through practice, through observation), as dis-
tinguished from knowledge gained through faith. Its specific meaning is
"Particul. Essais, opérations pour démontrer ou vérifier une chose" (trials,
operations to demonstrate or verify a thing). The
same term catches the common tentative quality of what English separates
as experience and experiment. Indeed, to experiment is given by "expérimenter,
v. tr. Éprouver par des expériences" (to test by experiments). The terms
were not always separated in English. The OED reports that in 1382, Wyclif's
translation of Genesis xlii, 15 (Revised Standard Version, 1952, "By this
you shall be tested") opens "Now y schal take experyment of ȝou", but in
the 1388 edition, it is "Now y schal take experience of ȝou."

Indeed, if this close linkage makes experiments of all experiences 
(both are derived from L. experiri, to try) then not only does all medical
treatment become biomedical experimentation, as we are told, but all sensory 
experience and knowledge gained thereby becomes experimental. Possibly, 
this is what Moore was leading up to when he noted that "every surgical
operation is an experiment in bacteriology, ... [in] pharmacology, ... [in]
anatom[y], [in] biology" (F. Moore, 1975, p. 15), for shortly thereafter he
speaks of "this basic experimental nature of clinical medicine and, indeed , 
of all human intercourse" (p. 16, emphasis added). Since teaching "is applied
in a new context each time," as is serving customers, and conversing, these,
too, become experimentation with human beings. 

A simple test which distinguishes scientific experimentation from the 
practices of clinical medicine, routine or innovative, of teaching etc., 
would be to apply the principle of concordance, in the form of a simple 
question: Would a group such as the National Science Foundation give research 
grants in bacteriology, pharmacology, anatomy, and biology for "every 
surgical operation", for every classroom session, and so on? If the distinction
between the scientific usage of experimental and the lay usage of the
term (and the usage by professional writers in the field we are discussing),
and the distinction between experimentation and treatment, are not clear to
any investigator or practitioner who submits a research proposal, they will
be clear after review.

What defines research varies with the discipline, the research strategy,
and the review agency or journal; no definition will therefore be offered here.
The peer review committees of the various granting agencies and the editorial 
reviewers of scientific journals and agendas of scientific meetings offer 
sufficient definition. Whether or not a particular project is proposed for 
such review, its designation as a research project might depend on an affirmative 
answer to the concordance question, which in this case is put hypothetically, 
and only to define the behavior. 

Whether activity qualifies as acceptable treatment might similarly be 
defined by peer review, in this case weighted toward post-hoc review. If 
scientific review is to be used as an example, "track-records" of each 
practitioner might serve evaluative functions, just as department heads file 
publications of faculty for consideration of tenure and promotion, and just 
as grant review committees require such listings and evaluation of quality. 




Where committees are institutional, their members are subject to the
same contingencies which govern the person under review. Independence is
preferable. To assert that the public is best protected by having reviewers 
who are outside the specialty and are therefore personally impartial misses 
the point. The critical issue is to ensure independence of contingency con- 
trol. In areas where specialized knowledge is required it becomes all the 
more important to build in independent contingencies since the special interest 
groups being regulated are the ones which possess the special knowledge 
needed to regulate. Indeed, the history of governmental regulatory agencies 
shows that they wind up being run by the groups they are supposed to regulate. 
It should not be assumed that research and treatment will be exceptions. 
Even where the contingencies governing regulator and regulated are separated, 
there can be "deferred bribes", that is, hiring by the regulated once the 
term of the regulator is up. 

The existence of yet a different type of public protection is implied 
by statements such as "doctors (or other professionals) always stick together." 
Where the implied consequence of a coverup of a person or agency is protection 
of a profession or other specialty group, the argument that only such specialists 
have the evaluative skills may be beside the point. The solution in practice 
is to have a review group comprising members of other specialty groups. 
However, this solution of professional impartiality may also miss the point, 
which is to ensure independence of contingency control. 

For research in the context of treatment, if the research is to be 
meaningful it should meet the concordance criterion mentioned. If the 
treatment is to be considered acceptable, it should meet the criteria for 
treatment. Stated otherwise, clinical research should meet both criteria. 

The concordance solution may also apply to a practitioner who, having 
provided acceptable treatment for some time, would now like to go over the
records for their possible contributions to science or general treatment.
It should be noted that research grants are made for historical and archival 
analysis, and the research concordance principle would apply to the procedures 
for analysis, the records available, and so on. If types of patients 
(students, etc) and types of treatments selected allow comparisons and 
facilitate research, the use of intent as a taxonomic device poses a pro- 
blem, since it may be inferred that choices for treatment were governed 
by the "intent of developing new knowledge" (Levine, 1975a, p. 6), that is, 
of research. The procedures are, after all, in concordance with research. 
If the treatment provided was concordant with treatment, it also meets this 
test. Selection of patients and treatments is also concordant with treat- 
ment, as evident by professional specializations in both patients and 
treatments; economic and other selection criteria ("I can't treat that 
type") abound. Using a particular type of procedure for a particular type 
of patient is, after all, what diagnosis is about. And if the particular
patient-treatment interactions are treatment-concordant, the fact that they 
are also research-concordant may be the concern of the research review 
committee. 

In all events, now that treatment is coming under public scrutiny, 
treatment systems might profitably examine the procedures developed by 
cognate systems governed by similar contingencies, namely, scientific 
research systems whose major funding has come from the same public sources 
that will be increasingly tapped for treatment, with the same requirements 
for accountability. 
Effects on innovation and the accepted practice of medicine.

The fact that innovative treatments or treatments in new contexts 
are defined as experimental (cf. Beecher, 1970; Freund, 1969; Ladimer, 1963;
McDermott, 1975; F.D. Moore, 1975) is of concern to lexicographers and will
not be pursued further here. New procedures and new conditions can be con- 
cordant with treatment and, when so used, Freund sees "no quarrel" (1969,
p. 317). Our concern will be with the testing of innovative treatments, 
which may fit the research contingency noted, although review committees 
tend to regard such proposals as "demonstration proposals" rather than 
"research proposals". Since innovation may be defined as a departure from 
"the routine and accepted practice of medicine", henceforth to be abbreviated 
raapo medicine, we shall also discuss raapo medicine when implications of 
innovation apply here as well. Research and treatment contexts will be 
treated separately. 

If innovations are not to be accepted until it is demonstrated that the
gains are worth the "risks", an issue that immediately arises is our satis- 
faction with raapo treatment. Are the gains worth the "risks" here? And 
how do they compare with innovation? Or do we apply a grandfather clause 
to raapo treatment? The issue, Robbins notes, "not only applies to procedures 
that are developmental or experimental but also to many procedures that 
are considered established and about which questions of risk are no longer 
raised" (1975, p. 4). And Eisenberg notes that the requirements for therapeutic 
trials may be standards of "safety and efficiency beyond those that can be 
offered for the best of medical practice" (1975, p. 96). With regard to 
raapo medicine, he cites the case of Benjamin Rush, who is considered to be 
one of the fathers of American medicine. During the plague of 1793, he 
remained at his post in Philadelphia, ministering to the stricken, instead of 
joining most of his colleagues in their escape to the country: 

"Messianic in his zeal for purging and blood-letting,
therapeutic maneuvers based on contemporary authority,
he went from home to plague-ridden home, causing more
carnage than the disease itself. Good intention ...
provided no substitute for knowledge then, nor ... now"
(1975, p. 96; emphasis added).
And Beecher notes that "a number of examples come to mind to suggest the 
need for healthy skepticism as to how readily established a standard may be," 
(1970, p. 92). 

In discussing private and public good and harm, over short and long 
run, Barber suggests that "a rough functional calculus" be applied which 
"shows some definite net advantage all around" (1967, p. 100). What he is 
proposing has some elements of a decision approach. Some optimization 
criterion is to be applied to a 2 x 2 matrix, whose columns are private and 
public and whose rows are short and long run, with specific consequences in 
the cells. I am proposing that we begin considering the application of 
formal decision theory to the assessment of innovative approaches, since 
these are, after all, social decisions. 

The decision criterion to be applied must be specified. Claude 
Bernard's implied criterion of no "ill to one's neighbor" is moderated by 
Beecher's "shades of gray" (quoted in Barber, p. 98). The decision criterion 
would be applied to a matrix whose columns are types of treatment and whose 
rows may be that which the treatments are to be applied to. These may be 
different diagnoses, or different assumed stages of an illness. In cancer 
research, for example, chemotherapy and radiation might be applied to cases 
where the probability of metastasis was > .2 and < .2, and all four empirically
obtained effects (entries in the cells of the matrix) might help obtain 
comparative "expected values" (a decision criterion) of these two (or more) 
treatments for these probabilities. Similar matrices might be applied for 
other probability levels. No ready prescription is offered for the row
entries, nor are the possibilities exhausted. 
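
The expected-value comparison described above can be put in a short sketch. All treatments, probabilities, utilities, and the cost term below are invented for illustration only; nothing here is clinical data, and the expected-value criterion is only one of the decision criteria the discussion allows.

```python
# Illustrative sketch of the decision matrix proposed above: columns are
# treatments, rows are conditions (here, assumed probability of metastasis),
# and cell entries are hypothetical outcomes. All numbers are invented.

# cell entry: (probability of good outcome, utility of that outcome)
matrix = {
    "metastasis_p_gt_.2": {"chemotherapy": (0.40, 10), "radiation": (0.30, 10)},
    "metastasis_p_lt_.2": {"chemotherapy": (0.55, 10), "radiation": (0.60, 10)},
}

def expected_value(p_good, utility, cost=2):
    """One decision criterion: chance-weighted benefit minus a fixed cost."""
    return p_good * utility - cost

for condition, treatments in matrix.items():
    for name, (p, u) in treatments.items():
        print(f"{condition:20s} {name:12s} EV = {expected_value(p, u):.2f}")
    best = max(treatments, key=lambda t: expected_value(*treatments[t]))
    print(f"  -> higher expected value: {best}")
```

Extending the columns with raapo treatment and placebo, as proposed below, changes only the number of entries per row, not the computation.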

Outcomes need not be restricted to gains and losses, or benefits
and damages. Elsewhere (Goldiamond, 1974) I have noted that two treatments
which equally control self-damage (physical constraints and occasional slaps 
upon head-banging by an autistic child), may have different effects on what 
new behaviors may be taught (none in raapo constraint, and progress toward 
developmental norms in behavior modification), and protection of civil
liberties and right to treatment might also be considered (Goldiamond, 1975b). 
A matrix was offered to rationalize the tendency to overdiagnose and undertreat 
found in some psychiatric hospitals (Goldiamond, 1974). 

What is being proposed is that the evaluation of benefits and damages 
of an innovative procedure never be assessed purely in terms such as how 
much damage are we willing to tolerate for how much benefit, that is, in 
terms of effects of the procedure alone, but that comparison with the benefits 
and damages of raapo treatment be the routine strategy. Formal decision 
theory minimally requires a 2 x 2 matrix, and a decision is not defined in 
terms of weighing alternative outcomes of simply one course of action.
Ordinarily, it would seem that a control group provides such a possibility, 
but I am suggesting that raapo treatment be that control, or one of two 
controls. This might give a 3 x 2 matrix, with the columns being innovative 
treatment, raapo treatment, placebo. 

Where the "expected outcome" data are available for raapo treatment, 
such data would be useful in comparing projections from innovative treatment 
as results are obtained. Where several types of treatment had been used, a 
historical analysis might supply cell entries which would be useful in 
establishing "expected values" of the treatments for different conditions. 
It should be noted that it is possible to construct such matrices only to 
the extent that the requirements of decision analysis (implicitly or 
explicitly) entered into data collection procedures. Where there are no 
data even approximating this requirement for raapo treatment, one might
question the bases for having accepted or continuing to accept this treatment
as standard, and question whether it should be used as a standard against 
which innovation is to be measured. 

The use of raapo treatment as a standard for defining innovation (that 
which deviates from raapo treatment) is carried to a logical conclusion when 
Levine extends this definition of innovation to the social sciences, namely, 
as that "which differs in any way from customary medical (or other professional) 
practice" (1975a, p. 24). The innovations would thus require all sorts of 
protections not provided in raapo social discipline. One example given of a 
parallel to the investigator-doctor role confusion is a criminologist-law-enforcement
officer. But suppose some highly undesirable hole (solitary
confinement) is raapo prison treatment, as indeed is the case (in one prison
in Illinois a cubicle within a cube within a cube is standard), and suppose
a warden-penologist wishes to see if such treatment is necessary (a general
statement) and, for half the prisoners so consigned, converts the cubicle to
a larger room, provides options, and so on. He records differences between 
the two situations. Would we require the imposition of informed consent and 
all the other safeguards for this deviation from "customary [penal] practice", 
when they were not required for the standard procedures? A decision matrix 
might prove quite useful (procedures x assumed severity of offense) in 
convincing the outside world to adopt the change, or to whom to apply it. 

All of the foregoing may be summarized by a common expression: when
innovative treatments are assessed, comparative raapo treatments should be
"up for grabs." By this process, raapo treatments might gradually be 
clarified as innovations progress. 

This maxim should not hold where the treatment practices of a practitioner 
are under scrutiny, since the practitioner should not be faulted for what 
was then not known. Thus raapo treatment would remain as the safeguard it
has been for the practitioner who uses it, but would lose this position in
the evaluation of innovative treatment. The two functions would be separated. 

Separating the evaluative (research or demonstration) and treatment 
functions provides safeguards for the practitioner of raapo treatment. But 
what of the practitioner of innovative treatment? Given the uncertain nature 
of raapo treatment outcomes, and given the fact that research is not the only 
avenue to discovery, and that treatment may also provide such an avenue, the 
social and personal stakes in innovative treatment are high. I submit that 
the principle of concordance also extends to innovative treatment. Here, it 
is treatment concordance which is involved. With regard to analogous raapo
treatment, whatever consent procedures obtain; whatever degree of prior
specification of procedures and alternatives is required; whatever degree of 
evidence of effectiveness and evaluation in terms of cost of treatment, dura- 
tion, and possible harm are required; whatever proscription holds against use 
of an explicitly designated procedure until it is evaluated further; whatever 
degree of post-hoc review is required, -- these might also be required in inno- 
vative treatment. In addition to protecting the social and personal stake in 
innovative treatment, such treatment concordance might also protect the patients 
(clients, students, etc.) at least as well as they are now protected by the 
analogous raapo treatments. Where such concordance exists, the fact that 
innovative treatments differ from raapo treatments should concern neither type 
of practitioner -- until innovative and raapo treatments are evaluated. As 
was suggested, evaluation of innovation would routinely call for simultaneous 
and comparative evaluation of analogous raapo treatment. 

The social and personal ends (consequences) contingent on innovation
and research are not served by confusing them, and are best protected by clear 
definitions and distinctions between them. That innovation (discovery) is 
not congruent with science was discussed in a philosophic context by
Reichenbach (1951), who distinguished between the context of discovery and
the context of justification (p. 231). It is a particular set of formulations 
of the latter which distinguishes science, and it is "the adequacy of the 
empirical procedures [which] governs the adequacy of the experiment 
and minimally demonstrates the competence of the scientist" (Goldiamond, 
1962, p. 310). What it is that is evaluated in this manner can have been 
suggested to the investigator "by a theoretical issue, by a procedural issue, 
by his own subjective experience, by accident, by mistake, by serendipity, 
or in some other way" (Ibid), including treatment. As was noted, the con- 
tinued confusion between innovative and experimental is of concern to lexico- 
graphers. The formulators of social policy have other concerns. 

Innovation which is governed by scientific contingencies should be 
considered as scientific in concordance with defining criteria of the rele- 
vant scientific communities, and innovation which is governed by treatment 
contingencies should be considered as treatment in concordance with such 
defining criteria of the relevant treatment communities. The concordance 
required for research in the context of treatment is that of both communities 
for the contingencies in their respective domains. The evaluation of inno- 
vative treatment would require evaluation of raapo treatment. Such joint- 
evaluation, since it is governed by scientific contingencies, should meet 
the defining criteria of that community, as well as raapo treatment concor- 
dance for both innovative and raapo treatments unless concordance were 
already there, as in evaluation through historical research. Evaluation 
of different raapo treatments would be similarly considered by both 
communities. 

It would seem that the principle of concordance contributes not only 
to the definition of treatment and research, but also to evaluation of 
innovation and treatment, and to protection of the social and personal stake
in innovation, as well as to the protection of individuals treated thereby.
III. DIFFERENT CONTINGENCIES GOVERNING PATIENTS, 
SUBJECTS, AND RELEVANT PROFESSIONALS 

In the discussion of occasions-consequences for treatment and research, 
it was noted that the patient and the systematic formulation are analogous, 
but patient and research subjects are not. This implies that patient and 
formulation will be treated with analogous respect (or disrespect) since 
social support for the systems involved may ultimately be contingent on how 
successfully the systems produce their assigned outcomes. This also implies 
that patients and research subjects, since their positions are not analogous, 
will occasion nonanalogous professional behaviors in the treatment and
research enterprises, as enterprises. The conclusion that the protection
of patients and subjects requires different types of review procedures is 
accordingly a valid one -- as long as the discussion is confined to the 
enterprises as enterprises. However, as will be noted in Section IV, there
are overriding commonalities in other social contingencies, which dictate a 
different conclusion. 

In treatment, an extended sequence of interactions between patient 
(student) and professional is often required for each. An operant chain is 
thereby described; the link reinforcers derive their reinforcing properties 
from their progressive relation to those consequences for which the whole 
sequence is required. On a day-to-day basis, the practitioner's treatment 
efforts along certain lines are reinforced or weakened by ensuing changes 
(depending on direction) of the patient, these then occasion further efforts 
on the practitioner's part, these are then strengthened or weakened, and so 
on. The three-term contingency is clearly evident. In this interactive 
arrangement, the patient's outcomes control the professional's behavior,
providing both occasions and maintaining consequences for it. The patient's
behaviors are reciprocal: the presentation of complaints and reports of 
relief are patient behaviors which are the occasioning and reinforcing stimuli 
which bracket the practitioner's behaviors. These patient behaviors, as 
well as compliance with other "orders" (the "patient role") are maintained 
by the same consequences which maintain the practitioner's behaviors, namely, 
their progressive approach to the outcome which maintains the entire sequence. 
Thus the (patient-practitioner) "mutuality of outcomes" which is used to
describe the terminal outcome of "successful practice" also applies to the
links in the sequential chain. There is not only mutuality of outcomes 
but reciprocity of behaviors. As Parsons observes, "each participant 
receives in the short run a quo for the quid that he contributes" (1969, 
p. 338). It should also be noted that a third party enters into this mutuality. 
It is the social system, for whom this outcome is also meaningful, and to 
obtain which it supports the treatment system. 

In experimental research, investigators are engaged in an extended 
sequence of interactions with their data. In operant and related single- 
organism research, the investigator's manipulations along certain lines are 
strengthened or weakened by ensuing changes (depending on direction) in the 
dependent variable, these then occasion further manipulations on the investi- 
gator's part, these are then strengthened or weakened, and so on. The three- 
term contingency is clearly evident. The orderliness of the data controls 
the investigator's behavior, providing both occasions and maintaining con- 
sequences for it. In most research using statistical inference, this
progressive control by increasing orderliness is evident in a series of
experiments, by one or several investigators. Ensuing experiments are governed by
outcomes of the preceding ones. The outcome which maintains the sequence of 
investigator-behaviors in a single-organism operant investigation, or in a
series of statistical studies, is increased orderliness or systematization of
statements. The third party here is the granting agency, for whom this 
outcome is also meaningful, and to obtain which it supports the research 
system. 

Since the patient's outcomes control the practitioner's behavior, and 
the experiment's outcomes control the investigator's behavior, it can be 
said that the patients control the practitioner, and the "data control the 
experimenter." Indeed, the patient pays the practitioner, who is thus clearly 
identified as the agent of the patient. In the case of research, it is the 
social system, through its granting agency, that pays investigators. They are 
thereby the agents of the granting agency. They write reports for it, agree 
to provide time for it, and so on. The mutuality of outcomes and reciprocity 
of behaviors which characterize relations between patients and practitioners 
in treatment, also characterize relations between granting agencies and 
investigators in research. Patient and granting agency are in parallel 
relation. Payment is, accordingly, critical, and not extraneous, as Levine 
suggests (1975b). It helps define and separate agent from client in both 
treatment and research, in addition to filling other functions to be dis- 
cussed in Sections IV and V. 

Research subjects do not enter this realm of discussion. They 
play yet a different role. This role is evident if one first summarizes 
profession-agent roles in treatment and research. 

A. Treatment:          1a. Professional is agent of patient.
                       1b. Patient is client of professional.
                       2.  Professional agent is paid by client patient.

B. Research:           1a. Professional is agent of grantor.
                       1b. Grantor is client of professional.
                       2.  Professional agent is paid by client grantor.

C. Research Subject:   1a. Subject is agent of professional.
                       1b. Professional is client of subject.
                       2.  Subject agent is paid by client professional.
Vis-a-vis the subject, the professional is in a reversed position from 
either of the two preceding ones. Since the professional is an agent of the 
granting agency, the subject by extension is also. The subject can be 
described as being in a "line position" rather than in one of continual 
interaction with the professional or the granting agency. 

A fourth relation of interest can now be considered. This is the 
situation where research is conducted in the context of treatment. 

D. Research-Treatment: 1a. Professional is agent of patient (A-1).
                       1b. Professional is client of subject (C-1).
Since the subject is also the patient, the same person is both client 
and agent. If the practitioner is also the investigator, this confounding 
holds on this side, as well. If practitioner and investigator are separate 
in person, both may be similar in role, since they are agents of the same 
client institution (hospital or university) which pays their salaries. Unless 
the relations are made explicit, and steps are taken to separate the functions 
(some of which will be discussed), there will be problems in a variety of 
areas, including coercion and consent (see Gray, 1975, for some of the
contamination).

Since the investigator pays the subject and the patient pays the pro- 
fessional, when investigator and professional are the same, and subject and 
patient are the same, each should both pay and be paid. Indeed, the cancella- 
tion or lowering of patient fees in many clinical-research units supports this 
statement. 
Means-ends relations 

It is frequently asserted that since the research subject lacks whatever
protection the patient gets from the mutuality of patient-practitioner
outcomes, the subject requires special protection. The particular jeopardy 
in this case is that the subject may be used as a means to obtain the 
investigator's end, namely, general knowledge. This may not only be
unhelpful to the subject, it may be harmful. Where research is conducted in
the context of treatment, it is at best simply extraneous to the outcome of 
treatment, and at worst, in opposition to it. 

In research, human subjects are considered specially subject to abuse 
since a variety of social consequences are contingent upon the investigator's 
contribution to knowledge. Dependent on publication are prestige, promotion, 
income, research funds. These outcomes for the professional can not be 
characterized by the mutuality of patient-practitioner outcomes which 
characterize treatment. Nor are they even congruent with the payment or 
course grade used to maintain subject participation. The subject is there- 
fore liable to abuse -- the consequences cited are strong ones and are not 
shared by the subject. 

In treatment, however, similar consequences are also likely to hold. 
Presumably, dependent on the practitioner's success in treatment are such 
consequences as prestige, promotion, income, and access to facilities. 
These outcomes are not characterized by mutuality of patient-practitioner-
social outcome. Such divergence in outcomes between professional and client 
was the occasion for the anguished cries of Linus in the Peanuts comic 
strip series when he discovered that his teacher was getting paid; he was 
broken-hearted to discover she was not governed by his learning. (The 
consequences for students in elementary school systems for which the governing 
outcomes are other than student progress are more disastrous.) The 
dimensions along which critical differences may lie, when one views the 
systems as systems, are in the different socially-defined contingencies
previously discussed, which distinguish treatment from research. The ethical
issues, in part, reside in the fact that the outcomes determined by the
social systems in the two cases do not consider research subjects. The 
outcomes are, in one case, treated patients, educated students, trained 
technicians, and so on, and, in the other case, tested and better
organized systems of knowledge. Where there is abuse, it resides partly in 
the specific procedures used by particular systems, and partly in the rela- 
tions which research and treatment share with a host of other social 
institutions, and which will be discussed in Section IV, and not simply in 
the use of the subject as a means, since the patient may also be used in
this manner. 

IV. ABUSE OF POWER: COERCION AND CONSENT 
A variety of interpersonal relations including those of research and 
treatment may be described as power relations. The common contingencies 
related to this common descriptive term make possible the abuse of power 
they share. The issue of consent is addressed, in part, to such abuse in 
the context of coercion. The present section will consider coercion as 
it applies to the abuse of power and to consent. Section V, which follows, 
will consider informed consent in the context of contractual relations. 

Ethical issues are raised when power is abused. Interpersonal power 
relations may be found not only for investigators and their subjects, and 
doctors and their patients, but for governors and governed, officers and 
enlisted men, employers and employees, teachers and students, ward committee- 
men and appointees, husbands and wives, parents and children, to mention 
but a few. In each of these, power flows both ways, but the alternating 
powers, unlike alternating currents, differ in topography. The focus here 
will be on the first party, who may be said to be the "exclusive vendor" 
or distributor of the occasions and consequences which critically bracket 
socially-relevant behavior of the second party and may thereby control
it. In this model, the comparable control exerted by the second party is
trivial. Since control over exercise of the powers of the first party
does not derive from consequences supplied by the second party, it would 
appear to be under other control . 

One model used to describe such other control is "self-control," which
may (or may not) be related to an ethical code. That such codes are
addressed to the asymmetric power flow described is suggested by consider- 
ation of "the moral law as such [as being governed by] a transcendent 
motivation" (Jonas, 1969, p. 232; cf. Goldiamond, 1968). Stated otherwise, 
it transcends control by the consequences supplied by the second party. 
Violating the code is immoral or unethical and censure is applied by peers,
that is, by those with parallel dispensation powers. 

The appropriate exercise of these powers may be considered to be a trust 
as defined by an explicit social fiduciary model, whereby kings, officers, 
employers, bankers, and husbands exercise their powers for protection and 
benefit of their wards (not only did the French general address his enlisted 
men as "mes enfants", but the Russian enlisted man addressed his commander 
as "Otyets", i.e., Father). Fulfillment of a trust is involved. Hence 
fiduciary (L. fidere, to trust).

Needless to say, when the behaviors by which one party controls the 
behaviors of a second are not controlled by the second, and the first party 
is then considered to be under self-control or control by a code of ethics, 
the underlying assumption is that the first party's behavior is under some 
form of control. The necessity of internalizing the control, in the form 
of ethical adherence to a trust, derives from dissatisfaction with an ex- 
planation of control by a subordinate. However, the control may derive from 
a superordinate system which establishes and maintains the institutionalized 
relation between both parties, both of whom are therefore its agents. The 
social behavior of establishing and developing institutionalized trust 

14-34 



contingencies, like the support given the treatment, research, and legal 
institutions, is maintained by the outcomes the system gets when it provides 
such support. As in the case of the use of a term as difficult to define as 
intent, the problem to which a term as difficult to define as internalized 
adherence is addressed may be resolved by consideration of social contingencies. 
That they bear on an important social problem is indicated by consideration of 
at least one form of abuse of power. 

Such a case of abuse of power is defined when a member of the first 
party makes the social contingency (which governs the institutionalized 
relation) contingent on behaviors by the other which are outside the
social contingency, or applies the social contingency in other ways to
get such behaviors. The David and Bathsheba episode is an early instance 
and provided the occasion for an explicit moral sermon. In a more modern 
vein, Peters, in Ethics and Education, notes that "It is one thing for a
university teacher to have an affair with his colleague's wife, but it is
quite another thing for him to seduce one of his students" (1967, p. 210).
The latter case permits an abuse interpretation: grades and prestige, 
socially approved to govern academic compliance, are made contingent on 
a different pattern of compliance. Thereby, it will be noted, society 
is not obtaining the occasion-consequence reversal which reinforces social 
support of universities: the untrained student has not become (academically) 
trained thereby. The teacher, accordingly, may be jeopardizing social 
support of universities. His university-supported peers may therefore suffer 
and may then censure him in some way. And the social system is frustrated 
(nonreinforced). He has "hurt his profession" by his "antisocial behavior."
These terms approximate the relevant terms in the social contingency. He 
"has violated his trust" refers to the fiduciary model. His "unethical 
abuse of power" refers to the asymmetrical power model. All derive from
the social contingencies discussed.

An interpersonal relation in which power derives from coercion is fertile
ground for unethical abuse, since it permits easy control of behaviors
outside the contingency. A patient under tremendous distress which can
be alleviated only by an emergency treatment is subject to abuse by the
sole dispenser of that treatment. The dispenser can make dispensation
contingent on a variety of requirements -- including consent for research
as well as for a variety of treatments. The validity of consent obtained
under such conditions, no matter how informed the consent was, might be
questioned. It might be argued that the procedures represented a flagrant
abuse of power, and that the consent was spurious. It was obtained under
coercion and was not freely given. The person was not in a position to consent.

It is evident that in order to consider the validity of any type of
consent, we must first examine freedom and the coercion assumed to negate it.
Contingencies of freedom and coercion

Freedom will be defined in terms of the genuine choices available. Choice
will be defined by degrees of freedom (df), a scientific term which will be
used here to define the number of variables in a system whose values have
to be specified to determine the system. The volume of a cube is given by
V = lwh, and given any three values, the fourth is determined (Vlw to determine
h, Vwh to determine l, and so on). Thus, df = 3, as it is to specify the
coordinates of a point in 3-dimensional space. Our concern is with alternative
behaviors, and we shall use decision theory as our model. Here, at least
two well-defined sets of behavior are required (for example, being at home
or at work are well-defined alternatives, but being at home or elsewhere
introduces the poorly-defined set of elsewhere, which can include the moon
and Jupiter), and the sets are related by the equation a + b = 1.00. Since the
value of either then determines the value of the other, df = 1. Where
a + b + c + d + e = 1.00, df = 4. There is a greater degree of choice, that is,
there are more degrees of freedom. The df term is a useful one. It not
only suggests that freedom is a matter of degrees, but also implies that
coercion (to be defined presently) is also a matter of degrees.

The parallel between intuitive notions of freedom and the df usages
is suggested by the fact that when the only work available is in a mine,
and otherwise the person goes hungry, then work in a mine may not be
considered a matter of free choice and, indeed, union experience has taught
that miners are then more vulnerable to abuse than they are at other times.
With regard to work as the referent, since there are no work alternatives,
df = 0. There are no degrees of freedom. This accords with the common
expression. If there is a choice between mine, mill, factory, or farm, then
there is greater freedom, workers can feel "more independent," and abuse
is less likely. Here, df = 3. Freedom, as defined intuitively or by values
of df, is greater.

Freedom is related to coercion in the following manner. To the extent
that a critical consequence (to be defined) is contingent solely on a
class of activities, then dc, or degree of coercion, is inversely related (the
term is used figuratively, rather than exactly) to df. Assuming temporarily
that survival is such a critical consequence, where one works in the mines
or starves, coercion is maximal, since the maximum of dc will be given
when df = 0. Where there was a choice between mine, mill, factory, and farm,
coercion was less, since df had a higher value; but for the set of unskilled
labors represented, as opposed to starvation, there is none, and the
complaint of the uneducated, that their freedom of choice is confined to
jobs undesired by others, becomes understandable. At any point, of course,
the set of all possible tasks as opposed to survival can be considered
coerced. Accordingly, the issue is never coercion versus no coercion, since
at df = 0, dc = 1.00 (roughly); that is, one defines the other, and they are
codefined. The issue is the amount and type of coercion we are willing to
accept, and the protections against abuse we set up. These should be defined.

It was noted at the beginning of this section that choices had to be 
genuine. Genuineness relates to contingency repertoires. Someone with a 
high school education who scans the want-ads, has no choices when all 
openings require a college education. He does not have a choice between 
working as a miner or as a physician when there are openings in both fields. 
Here, df = 0 because of the behavioral repertoire. Where job availability
is not announced, or is circulated in channels not available to the seeker, 
or in a language the seeker cannot read, the existence of the appropriate 
repertoires is irrelevant, and df = 0 because of the opportunity component
of the contingency repertoire. Further, there is experimental evidence that 
given occasions which are in the repertoire, given behaviors in the repertoire, 
and given potent consequences, the individual may persist in behaviors which 
result in loss of consequences, or may switch to those which rapidly produce 
them, depending on the manner in which the consequences were previously 
contingent on behavior (Weiner, 1972). Finally, the consequences enter, 
as when the type of food available is forbidden by a powerful religious 
code. Failure to distinguish genuine choice from simple availability of 
alternatives, no matter how well their availability is made known in an 
informed consent procedure, is reminiscent of Anatole France's statement on 
the impartiality of the law which "in its majestic equality forbids the 
rich as well as the poor to sleep on the bridges, to beg on the streets, and 
to steal bread" (Le Lys Rouge, Chapter 7).

Some consequences are at certain times more critical than others, depend- 
ing on a variety of conditions whose investigation is being pursued in the 



laboratory. In one branch of such research, the organism may be offered a 
choice between two consequences, with response costs and other variables 
held equal. The extent to which one is valued more than the other can not 
only be measured but can be manipulated experimentally. One method is 
through deprivation, often referred to as need, or drive. Organisms at full 
body weight may prefer the opportunity to exercise over the opportunity to 
eat, but if they are deprived of food, the order of preference may be 
reversed. Other procedures may be utilized by the investigator, and all of 
these will be subsumed under the general term of conditions which make a con- 
sequence critical, that is, one which is preferred in all choice situations. 
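The preference-reversal example above (exercise preferred at full body weight, food preferred under deprivation) can be stated as a toy model. The function, the state flag, and the orderings are illustrative stand-ins for the experimental procedures described, not data from the source:

```python
def preference_order(food_deprived):
    """Toy model of the text's example: the ordering of two
    consequences depends on a deprivation condition that the
    investigator can manipulate."""
    if food_deprived:
        # Under food deprivation, food becomes the critical consequence.
        return ["eat", "exercise"]
    # At full body weight, the opportunity to exercise may be preferred.
    return ["exercise", "eat"]

assert preference_order(food_deprived=False)[0] == "exercise"
assert preference_order(food_deprived=True)[0] == "eat"
```

The sketch captures only the ordinal claim: what counts as "critical" is not fixed by the consequence itself but by conditions imposed on the organism.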

Coercion accordingly may be defined as most severe when there are no 
genuine choices (df = 0), and the consequences contingent on behavior are 
critical. Coercion obviously relates to consent, since to the extent that 
coercion is involved, giving consent may simply be one more behavior added to 
the packet required to obtain the critical consequences. Where indignities 
are required, consent may simply become another indignity required to get 
the critical consequence or to avoid its absence, to state it in terms of 
negative rather than positive reinforcement. (For fuller discussion of
coercion under negative reinforcement, see Goldiamond, 1974, and for both 
negative and positive reinforcement, see Goldiamond, 1975a, b.) 

Two types of institutional coercion will be distinguished. In the first, 
the institution which delivers a critical consequence has set up the very 
conditions which make the consequence critical. In the second, the institution 
which delivers a critical consequence has not made it so. It is merely capita- 
lizing, so to speak, on an opportunity provided by a state of nature (actual 
or manmade). I shall designate these as Institutionally Instigated Coercion 
(IIC) and Institutionally Opportune Coercion (IOC). They will be considered 

separately. 

Institutionally Instigated Coercion. A familiar research example with 
a nonhuman subject is the conventional operant pigeon experiment. Here, the 
experimenter (or the assistant agent) deprives the pigeon of food and brings 
him down to 65-70% of normal body weight. The investigator then makes access 
to food contingent on required patterns of behavior. By careful programing 
of these patterns, the occasioning stimuli, or both, it has been possible 
to establish extremely complex patterns of behavior and discrimination, 
almost without error. In technical jargon, delivery of food serves to 
reinforce the response required to make it available; it is the experimenters 
who have so arranged it that delivery of food serves as a reinforcing stimulus. 
This they have done through prior deprivation of the organism. They need not 
deprive the organism to achieve this effect. They may simply provide a few 
doses of heroin to an animal with an indwelling catheter. Yet other con- 
ditions may be manipulated. 

If deprived pigeons could consent, and were required to do so, before 
undertaking the training program which is their only means of obtaining food, 
such consent could be considered as having been obtained under severe 
coercion, rendered all the more severe by the fact that it was the experimental 
system itself which made potent the reinforcer it provides. To, say, a 
four-link chain required to make food available, for example, pull a wire, 
turn a counterclockwise circle, press a pedal which illuminates a disk, and 
peck that disk 15 times, then get food, a sixth and fifth link would then be 
added: intelligently discuss your options, then sign consent to participate, 
then pull a wire, turn a counterclockwise circle, press a pedal which 
illuminates a disk, and peck that disk 15 times, and get the food, blessed 
food. The coercion would not be reduced; it might even be exacerbated. 

Consider the case of human inmates of a penitentiary. If they partici- 
pate in a particular biomedical research project, such cooperation, by 

demonstrating to the parole board the "acquisition of prosocial attitudes", 
renders them eligible for earlier parole. Stated otherwise, restoration 
of liberty or earlier release from incarceration (negative reinforcement) 
is contingent on an institutionally-provided opportunity to participate 
as a subject. The bicentennial notwithstanding, we do not need a Patrick 
Henry to remind us how critical a consequence liberty can be. The coercion 
is made all the more severe by the fact that the very penal system which
makes the delivery of liberty a reinforcer is part of the same judicial-penal
system which deprives the inmates of liberty. The analogy with the pigeon 
is almost a homology, and the meaningfulness of any consent obtained under 
these conditions would be questioned. (Conditions under which prison 
research does not fall into this category will be considered shortly). The 
same strictures hold even if the prisoners are offered their choices of reha- 
bilitative programs, if each is linked to earlier parole. These then become 
elements in a coerced set. 

In one form of "brain-washing" the person is deprived of the usual 
social support through isolation by physical or pharmacological restraints, 
or through isolation from the hitherto supporting community by a special 
communal arrangement. Social support by the new group is then made contin- 
gent on individual behaviors which meet its requirements. The most effective 
behavioral requirements are those behaviors whereby the person, by assaulting 
the sensibilities of the original referent group, is further isolated from 
that group by his or her own behavior, making the support of the new group all 
the more critical. The parent who makes a child dependent is a clinical 
example. 

What is probably the starkest case of institutionally-instigated-coercion 
is the use of torture to obtain evidence. Relief from pain is made contingent 
on behavior which meets the system's requirements. It is the system which 

supplies the painful stimuli which make relief from it a potent reinforcer. 
No civilized court would accept consent obtained under such means. Their 
equation with coercion makes clear the contingencies involved, which are 
often otherwise obscured by rehabilitative or other idealistic statements. 

Continuing on the same stark note, we routinely question the morality 
of those who create shortages and then profit from the delivery they mono- 
polize. 

In a less dramatic manner, the requirement of a department of psychology 
that each student in an introductory class participate as a subject in some 
experiment to obtain a passing grade belongs in this coercive category, 
to the extent that passing this course is critical to the student's academic 
program. However, the coercion is mitigated by its trivial nature, and the 
contribution of the experiments is typically in accord. 

(In a possibly facetious tone, the statement that "the lawyers" have 
us in their clutches may reflect not only their inescapability for us, but 
the existence of some overlap between the legal system which provides relief 
and the system which sets up the conditions which make its delivery a potent 
consequence. [The tax lawyers who write rules which only tax lawyers can 
decipher seem to be a case in point but, in actuality, social and political 
considerations often govern the rules.] The suggestion that legal practice 
be reviewed by committees composed of representatives of other interest 
groups may reflect not only retaliatory pique against legal advocates of 
"consumer" groups such as patients, prisoners, and students, but may also 
reflect the regulator-regulated issue raised by expertise which was noted 
earlier, as well as other professional issues. There is, after all, a 
legal profession which provides services to clients through socially-supported 
systems. It would be surprising if some of the issues raised in our discussion 

of treatment and research did not apply here, as well. There is legal 
research as well as legal service delivery.) 

In all events, consent to participate in some activity, where the 
consequence contingent on participation was made critical by the consequence- 
delivery system, should be considered as having been obtained under coercion.
This does not automatically exclude such consent or such activities from the 
pale since, as was noted earlier, the issue is not freedom from coercion, 
but rather the degrees and type of coercion we tolerate, and what safeguards 
against abuse these require. It should also be noted that it is the 
peculiar nature of the contingencies described which designate the activities 
and consent as coerced. The same activities and consent can be governed by 
other contingencies, which are not institutionally coerced. Given such con- 
tingencies, and where the activities are socially and personally beneficial, 
conditions appropriate to their support might be considered. To label an 
institution as coercive and therefore to assume that all related activities 
are coerced, is akin to certain characterological descriptions of individuals 
or classes of individuals which then subsume all individuals and all behaviors.
Both ignore the different contingencies which govern the different and 
varying behaviors of any complex social institution or, for that matter, 
any complex social individual. 

Institutionally opportune coercion . There are situations in which 
the system which makes critical consequences contingent on institutionally- 
defined behavior has not produced the conditions which make these consequences 
critical. The "helping professions", of which medicine is a prime example, 
belong in this category (iatrogenic disease is an exception, but is considered
an undesirable). Where df = 0, and the consequences are critical,
coercion is still defined. It is not lessened by the fact that it was not 
institutionally instigated, nor is it lessened by its social prevalence, 

inevitability, or desirability. The coercion is exacerbated when the 
institutions set up to treat the problem are operating under a "legally 
granted monopoly" over "a captive audience" (Freund, 1969b, p. 315). In 
effect, a critical consequence is not only solely contingent on submission 
to a particular form of treatment, but in addition, that form of treatment 
is provided solely by a system with monopoly control over its dispensation. 
The coercion possibly provides the system with an opportunity for socially 
appropriate practice or for abuse, which opportunity is not as generally 
available outside it. Accordingly, any consent obtained under such con- 
ditions requires careful examination. 

In the next few sections, I shall consider some possible arrangements 
whereby consent may be considered as possibly meaningful, when the person's 
entry into the system was coerced, whether coercion was institutionally 
instigated or institutionally opportune. Where these require different con- 
sideration, this will be noted. Three major arrangements will be noted, 
separating critical consequences from the activities, converting mutuality 
of outcomes to mutuality of contingencies, and noncoerced participation
in programs specific to coercive systems. 
Separation of critical consequences and activities 

In a prison situation, when earlier parole is independent of whether or 
not an inmate participates in a program, then consent to participate in that 
program is not related to the release which the penal-judicial system made 
critical. If a church provides food during a famine, whether or not the 
person attends church, then it is clearly not capitalizing on this opportunity. 
Similarly, if the same treatment is available whether or not the person 
consents to serve as a research subject, then the situation is similar to 
the church arrangement. Separation of critical consequences and activity 



simply removes this form of coercion. It does not, however, automatically 
instate other requirements to make consent meaningful. These will be con- 
sidered later. 

If making a critical consequence such as treatment contingent on research 
participation raises questions of appropriateness, it is partly because 
research is considered extraneous to the occasion-consequence reversal which 
characterizes treatment, and partly because of social values attached to 
relief of distress, among others. These considerations would also hold for 
making treatment contingent on ability to pay. It is highly likely that the 
United States will soon join other advanced nations which have eliminated 
this requirement. However, in the meantime, an ethical and social policy 
problem is posed by hospitals which make reduced payment or no payment con- 
tingent on serving as a research subject. It was noted earlier that this 
meets the exchange system logic of patient-pay, subject-paid, research 
patient-pay-paid, therefore fees cancelled. Whatever the goodness of its
fit to this model, providing free services in return for research partici-
pation poses questions about the ethical fit. Where treatment is contingent
on payment, the treatment consequence is critical, and the type of treatment 
offered is not genuinely (as defined earlier in terms of contingency 
repertoires) available elsewhere, the payment is coerced. That it is a social 
necessity is beside the point -- it is still coerced. For someone who lacks
the financial resources (repertoire), making service as a research subject
a substitute for payment substitutes research service for coerced payment
in the coercion arrangement described. It must then be recognized that
since research is thereby coerced, it is open to abuse, and consent must
be carefully examined. Few commentators have been sensitive to this
issue, but Eisenberg is on target when he doubts "that we will find a way of

distributing risk across all segments of society until we have a national 
health service for all citizens" (Eisenberg, 1975, p. 97). Under such 
arrangements, enrollment in a research-treatment program would be governed 
by considerations other than research substitution for coerced payment. 

Payment also enters into prison research (or special treatment programs). 
Where early parole and other institutionally-instigated critical consequences
are not made contingent on research-treatment participation,
this form of coercion is removed. Money, of course, is an important con- 
sequence though not necessarily a critical one for people who are otherwise 
fed, sheltered, and clothed. To the extent that it approaches being critical 
in a situation (as judged by its selection above other consequences), and to 
the extent that df approaches 0, the required activity approaches coercion. 
Critical nature and df will be assessed separately. 

With regard to critical nature, or uses of money to an inmate, it 
should be noted that the penal system deprives an inmate not only of liberty, 
but also of other amenities available in the world outside. Accordingly, 
institutionally-instigated coercion is defined not only when the system 
makes liberty contingent on some behavior, but also when it makes the other 
amenities of which it has deprived the inmate contingent on behavior. Where 
money buys freedom, it is evident that its payment has been coerced, and 
the behaviors upon which the wherewithal to pay is contingent are also
coerced. By the same logic, such coercion also enters into payments for 
amenities of which the prison system has deprived the inmate, and into the 
research/work programs which produce such payment. Before such programs 
are hastily condemned, an important qualification raised earlier should be 
reiterated. This is that coercion is not absolute, but there are degrees of 
coercion as well as of freedom. As was then noted, when work is the issue, 
availability of work in the mines, mills, factories, and farms is described 

by df = 3. However, given the set of menial work (mines, mills, factories, 
farms) and a starvation alternative to that set, df = 0, and menial work is 
coerced. This can be extended to "higher" levels ad infinitum , lending 
support to Ogden Nash's verse, "I could live my life in ease and insouciance / 
were it not for making a living, which is rather a nouciance." This form of 
coercion occurs in the world outside and is acceptable — and, indeed, is necessary 
there (exceptions such as inherited wealth exist, of course). The principle 
of concordance with such outside facts of life may then be extended to define 
an acceptable form of work-coercion in the institution, as well. The general 
rule involved would take a form such as: to the extent that the institutional 
work programs follow the work-requirements of inmates (or people with their 
skills in legally accepted work) in their usual world, institutional work- 
requirements provide an acceptable form of coercion. Exceptions derive, of 
course, from criminal work, e.g., the system would have to provide a forger 
with other work arrangements. Similarly, inmates who had never worked might 
be given work concordant with that available for people with skills and 
experience similar to theirs, or might get necessary training. Along these 
lines, it should be noted that at least one European prison provides for 
daily medical practice outside the walls for physicians serving their terms, and 
similarly provides for construction and factory work, etc., for skilled and 
unskilled workmen. Earnings on the outside are at the going rates there. 
In these institutions, the inmates also pay, from their earnings, for their 
room and board, as well as the extra costs which their incarceration incurs. 
Such institutions are special institutions with special programs prior to 
such arrangements, and during them. It should be noted that the world 
outside provides payments for research subjects, and in some cases, such 
payments are competitive with those for work. (Some nutritional research 
programs, for example, have provided salaries for college students during their 

summer breaks.) To deprive inmates of such work/research possibilities has 
the effect, at the very least, of depriving them of options concordant with 
those holding outside. Other effects have been cited by advocates of penal 
reform or abolition, and will not be discussed here. 

The value of n in df = n is, of course, resolved by application of 
the foregoing concordance principle. As many options might be available as 
are given by the socially-accepted skills of the inmates, the positions 
available and the exigencies of the institution. And there is no reason to 
exclude the option of serving as a research subject, providing the payment, 
conditions, and protection are concordant with those provided for a volunteer 
outside for whom other options are available. 

This approach to research participation might also enter into institutions 
whose coercive control is opportune, rather than institutionally-instigated. 
Stated otherwise, arrangements for research participation of patients under- 
going treatment might be concordant with the arrangements for research parti- 
cipation of paid normal subjects of the type described. Where the research 
is related to treatment, and the problem is a rare one, the subject/patient 
is then not a routine research employee, but one with special and hard-to-find 
qualifications. Arrangements should be commensurate and concordant with those
provided for skilled employees outside. Where the problem is more common, subject/
patients should be easier to find, and the situation is more competitive. 
Even under such conditions, as anyone who has conducted long-term research 
knows, the investment in the research patient or research pigeon is con- 
siderable, and the concordant arrangements discussed earlier would also hold 
here. It is assumed, of course, that for the patient, research is an option 
and not a requirement for treatment. Otherwise, institutionally-opportune
coercion holds, and the research-patient may be in greater jeopardy than a 
prisoner with other-than-research options. 

The issue of social versus individual needs is, I believe, inappropriate 
to this context. Edsall (1969) argues, for example, that individual treat- 
ment needs must occasionally be subordinate to social research needs, 
citing the drafting of young men as soldiers (pp. 472-3). Indeed, Beecher 
asserts that "parents have the obligation to inculcate into their children 
attitudes of unselfish service. This can be extended to include participation 
in research for the public welfare if judged important and there is no dis-
cernible risk" (1969, p. 282). The children of mothers on diethylstilbestrol
(DES) some twenty years ago might judge that "no discernible risk" to have been 
otherwise. The war situation is not analogous. The possibility of death 
and disfigurement is well-publicized. Such outcomes for the enemy accord 
with social contingencies, and the same fate for the local army accords with 
social contingencies of the enemy. It might be said that the volunteer con- 
centrates on the social contingencies of his side, and the draftee concen- 
trates on those of the enemy -- hence the coercion applied to his recruit- 
ment. Any analogy to research, whether in a medical setting or in a prison 
is far-fetched. As Jonas notes: "No one has the right to choose martyrs 
for science" (1969, p. 222). 

Converting mutuality of outcomes to mutuality of 
contingencies 

In a treatment system, it is the individual's responses (behavioral or 
physiological or both) which provide the occasions and outcomes whose reversed 
relation ultimately supports the profession and its professionals. To the 
extent that the individual's behaviors are brought into the same contingencies 
which govern the professional's behaviors, the professional's task is 
simplified. This requires that both work toward the same goals, or be 
motivated by the same outcomes, or that their behaviors be governed by the 

same consequences, to use three different descriptive systems. This 
holds for research as well as treatment. We shall consider treatment 
first, since such mutual outcomes are assumed to characterize treat- 
ment systems. Despite the mutuality of outcomes such systems are 
organized to deliver, the treatment-relevant behaviors of individuals 
and professionals are often also (or instead) governed by different con- 
sequences. These may frustrate one or the other or both. Further, the 
individuals and professionals may not be apprised of what the other is 
doing. They may not be apprised of the relation of outcomes to the require- 
ments of the other. Any of these may make informed consent meaningless. 
Accordingly, it may be worthwhile to examine how a system which is organized 
to deliver common outcomes might set up arrangements which facilitate such 
delivery, and under which arrangements informed consent might be meaningful. 
We might then see how these arrangements could be extended to a system in 
which it is assumed that common outcomes do not characterize individual and 
professional -- the subjects and investigators of research systems. 

Although treatment systems are characterized by "mutuality of outcomes,"
it was noted earlier that they are also characterized by "reciprocity of
behaviors." The physician orders and prescribes, the patient obeys and
follows; the teacher teaches and assigns, the student learns and follows; 
the trainer trains and provides experiences, the trainee learns and 
utilizes. Accordingly, although the culminating outcomes are mutual, the 
behaviors required are not. Further, the behaviors of one are the 
occasions-consequences of the other. The analysis suggests that regard- 
less of identity in culminating chain outcomes, the contingencies in the 
links of the chains are different in every component for professional and
individual. Occasions, behaviors, consequences differ. For the individual's 
behaviors to be optimally governed by the same consequences as are those

of the professional then, not only must the individual's behaviors 
be governed by the same general outcomes as the professional, but the 
explicit occasions, behaviors, and consequences of the links in the chain 
must also be the same for both professional and individual. To make the 
contingencies the same suggests that it is only when individuals have access 
to the same data about themselves which the professional has that it becomes 
possible for these to come to govern their behavior, as they do govern the
behavior of the professional. And in the difference between "come to 
govern" and "do govern" lies the professional training of the practitioner.
(The importance of past histories for a contingency analysis was noted 
earlier in the discussion on genuineness of choice as it relates to con- 
tingency repertoires. Among the major considerations was the "manner in 
which the consequences were previously contingent on behavior"). And I believe 
it might then be part of the professional's task to educate the individual. 
The education need not be of the kind or depth which produces a skilled
professional. It might be one which simply supplies the individuals with 
the tools for analysis and change in the problem areas of treatment concern. 
The individuals are the experts in the data and conditions of their own 
lives. If they are taught where and how to look, they can supply data and 
suggest relations which professionals can use to advantage for the solution 
of the presenting individual problems. Such data are otherwise not 
available. And individuals can also begin to analyze their own responses 
and occasions of concern, and try to figure out what to do about them, 
trying this tack and that, even as professionals analyze the same responses 
and try out different approaches -- procedures which they and the common 
language confuse with experimentation. Professionals keep written records 
and are guided by them. The system suggested would require individuals to 
do likewise, and professionals would have access to their records in

concordance with the individual's access to professional records. 

It should be noted that as chronic problems increase in importance,
and as the influence of the environment comes under increasing
scrutiny, at least one system of treatment, namely, medicine, is turning
increasingly to such individual self-management. Health delivery systems
are trying to train individuals in self-examination (e.g., breast cancer)
and self-monitoring (e.g., home sphygmomanometers), and physicians are
beginning to substitute education and joint decision-making for the assump-
tion that if they fulfill their trust in a fiduciary relation with their
patients, these wards should cooperate and meet their obligations of
obeisance and recovery.

A treatment system which requires individuals to keep explicit records 
in concordance with staff records can readily be converted into a research 
system, as well. The extensive data which such records provide are, as 
was noted, otherwise not available. They provide information about
responses of the individual under different conditions, and about the 
settings in which the problems occur which can be useful for research. 
Just as professionals often interpret the same data differently, the 
possibility of different interpretations of data from the same individual 
records may suggest itself to the individuals when they are required to 
interpret, or to individual and professional in their regular conferences. 
And just as in the course of professional conferences, the resolution may 
be to wait, to get more data, or to try this and try that. And it 
should also be noted that waiting (collecting more observations over time), 
or getting more data (running the same subject under more conditions), 
or trying this and that (manipulating different variables) are also means 
employed by experimental investigations for resolution of problems or 

conflicts in explanatory systems. To the extent that the recording 
system which is supplied to the individuals, the interventions suggested 
for them to make, and other procedures are in concordance with those 
behaviors which enter into the definition of a research contingency, 
individual records can contribute to research. Is such research use of 
records and interventions separate from treatment use? If one views 
treatment in the context of self-management for prevention, melioration, 
or maintenance, then research use by the individual becomes necessary for 
treatment use by both individual and professional. Finding out about 
oneself, about "how I function," through distinguishing poor "explanations" 
from- better ones, can be quite important for self-management or for improved 
professional management. And the "context of justification" of the scientific 
method is an excellent means for distinguishing acceptable formulations. 
Just as the treatment professional educates in the formulations and pro- 
cedures of that area, the research professional educates in the formula- 
tions and procedures of that research area. In a research-treatment system 
of the kind described, the individuals may gain insights which are important 
for the practical resolution of their problems. The investigators may 
gain insights into those general functional relations whose resolution is 
important for the resolution of systematic problems in their disciplines. 
In such a research-treatment system, research and treatment go together 
because each is required for the other. Individual and professional are 
both "research and therapeutic allies" who share what intelligence the joint 
effort requires be shared, while having their own separate sources. 

This setting describes for research and treatment the "collegiality"
between individual and professionals which Parsons (1969) sees as ideal,
and which Mead (1969) reports as obtaining in field anthropology (at least 
in those projects in which she has been involved). 

Where the treatment does not require research for its fulfillment, 
treatment can take place within the congruent-contingency system dis- 
cussed for treatment alone. Individual and professional are then 
"therapeutic allies" who share what intelligence about each other their 
joint effort requires, and reserve to themselves what is not required. 
For research alone, the congruent-contingency system would involve investi- 
gator and research subject. As "research allies" they would share and 
reserve corresponding intelligences. 

In certain treatment areas (clinical, educational, or training), the 
outcome-producing program is well-formulated, with each step having been
validated experimentally. In programed instruction (p.i., cf. Hendershot,
1967, 1974), the title of the text gives the outcome. Each of the frames in 
the text resembles a mini-contingency. An instruction appears, the student 
responds, usually by writing in a blank provided, the appropriate answer 
is then available for comparison. If there is a response-answer correspon-
dence, the student is then presented with the next frame, and so on. There- 
by, outcome repertoires are established which are far removed from those 
with which the student entered (The derivation from operant laboratory 
research is evident). The instruction which opens the frame, and the 
opportunity to move ahead (a consequence), contingent on adequacy of the 
student's response, may be considered as professional surrogates. They are 
always explicitly presented -- if the individual does not advance to the next 
step in "treatment", the reason is clear. (In a branching program, the student 
may be detoured to other steps, i.e. to differences in treatment before the 
main program is rejoined.) The students have access to their own perfor- 
mance and its adequacy at every step. (The steps are longer in the classroom- 
systems application known as p.s.i., where the instructions may be entire 
lessons, cf. Sherman, 1974). Although individual and professional
are not colleagues, or therapeutic or research allies, the explicit
presentation to the individuals of the same information about them which 
the professional has (albeit by a surrogate professional), which enters 
into collegiality, also holds here. In the form of p.i. known as computer- 
assisted instruction, this electronic surrogate-professional functions 
almost as freely as a professional (cf. Markle, 1975).
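
The frame cycle described above -- instruction, student response, comparison with the stored answer, advance on correspondence, detour to a branch on error -- is easy to state as a procedure. The sketch below is only an illustrative rendering of that logic, not part of the original discussion: the frames, answers, and branch structure are hypothetical, and response-answer correspondence is reduced to an exact-match test.

```python
# Hypothetical frames: the main line is 1 -> 2; frames 10 and 11 are
# remedial branches which rejoin the main program, as in a branching
# p.i. text. "next" of None marks completion of the main program.
frames = {
    1:  {"instruction": "2 + 2 = ?", "answer": "4", "next": 2,    "branch": 10},
    2:  {"instruction": "3 + 3 = ?", "answer": "6", "next": None, "branch": 11},
    10: {"instruction": "Count past 3: what follows?", "answer": "4",
         "next": 2, "branch": 10},
    11: {"instruction": "Count by threes past 3: what follows?", "answer": "6",
         "next": None, "branch": 11},
}

def run_program(responses, start=1):
    """Step through the frames: advance on response-answer correspondence,
    branch otherwise. Returns the (frame, correct?) record of performance,
    to which the student -- like the surrogate professional -- has access."""
    history = []
    frame_id = start
    for response in responses:
        frame = frames[frame_id]
        correct = response.strip() == frame["answer"]
        history.append((frame_id, correct))
        frame_id = frame["next"] if correct else frame["branch"]
        if correct and frame_id is None:  # main program completed
            break
    return history

print(run_program(["5", "4", "4", "6"]))
# -> [(1, False), (10, True), (2, False), (11, True)]
```

A wrong response at a main-line frame detours the student to a remedial frame; a correct response there rejoins the main program.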

One implication of the quest for collegiality in the p.i. context 
should not be overlooked. The implication derives from the question: when 
are individuals and professionals colleagues in such programs, if ever? 
Students are presented with detailed steps, hence are not allied with
the professional in their choice of them. The question is answered through 
reference to the development of the program. Here, there was opportunity
for collegiality between program developer and individuals in the analysis 
of each step and its judgment as wheat or chaff. The developmental 
research is done in the context of treatment -- teaching, in this case. 
Here, there is room for considerable flexibility and trying this and that 
which, in a good program, is concordant with research behavior. Once the 
program is developed, it is simply available for application, and there can 
be several different programs which explicitly produce the same outcome in 
different ways. The parallel with clinical treatment is evident. The 
major implication is that collegiality may be necessary when the steps in 
the program (linear, or branching) have not been validated. Further, such 
development would require both treatment and research. And a corollary is 
that when the program is developed, it is still necessary to provide 
individuals access to the same data the professional gets. 

The foregoing arrangements are obviously limited in their applica- 
bility. Among other limitations, they assume extended interactions over 
time. In treatment, such interactions are found in chronic care, education, 
or training. They are also found in acute care when coupled with long-
term recovery, or maintenance, or prevention programs. In research, 
extended interactions are found in laboratories which require extended 
experimental intervention, or where acute studies have long-term effects. 
Establishing arrangements of the type discussed is not an easy task. It 
requires careful and long-term contingency analysis which operant investi- 
gators and practitioners are familiar with, but in an area which is 
generally foreign to them, and whose required formulations have not been 
considered in the simpler operant arrangements studied thus far. 

Although such arrangements would seem to be of only limited applica- 
bility to acute care or acute research (those situations where inter- 
actions between individual and professional cover only a short span of 
time and are confined to a few episodes per patient-subject), they may 
suggest some principles which might be applied. This would hold especially 
if the episodes are considered as condensed interactions which follow the 
same rules as the more chronic ones. They occur too rapidly for the 
analysis which the more leisurely and more magnified chronic situation 
permits. 

Other settings and types of relations or problems or individuals 
may also suggest limitations. Nevertheless, the extent to which collegi- 
ality arrangements apply there might be considered. 

An example of one such research-treatment system is provided by our 
laboratory-clinic (Goldiamond, 1974). We have been developing and working 
with such an explicit congruent-contingencies system. We have thereby 
been requiring individuals to keep daily records of the problem-relevant 
contingencies of their lives, even as we require of ourselves. We have 
been trying to have them analyze these records, even as we would. The 
records are used by us for basic research in behavior analysis and 
behavior change in the context of treatment. Most of our patients have 
come from the well-educated middle class, as befits a university clinic, 
but lately we have been doing research in heroin abuse and have found the 
recording system to be applicable for urban poor with little education. 
As an illustration of how collegiality arrangements of the type discussed 
can lead to application of professional analysis and intervention by 
patients for their own problems, I shall cite the report of an out- 
patient upon his return from vacation. He had had a history of hospitaliza- 
tion for schizophrenia (his brother was recently hospitalized for the 
same problem). During his vacation, his wife walked out on him, leaving 
him alone in the motel. "I found myself sitting in bed the whole morning, 
and staring at my rigid finger," he said. "So I asked myself: 'Now what 
would Dr. Goldiamond say was the reason I was doing this?' He'd ask what 
consequences would ensue. And I'd say: 'Hospitalization.' And he'd 
say: 'That's right! Just keep it up and they'll take you away.' And 
then he'd say: 'But what would you be getting there that you're not 
getting now?' And I'd say: 'I'll be taken care of!' And he'd say: 
'You're on target. But is there some way you can get this consequence 
without going to the hospital and having another hospitalization on your 
record?' And then I'd think a while and say: 'Hey! My sister. She's
a motherly type, and she lives a hundred miles away.'" He reported that 
he dragged himself together, packed, and hitch-hiked to his sister who took 
him in with open arms. The education occurred in the process of analysis 
of several months of written records. 

Noncoerced participation in programs specific to coercive systems.

In a system using institutionally-instigated coercion, consent is 
suspect when it is obtained for participation in some program, research or 
treatment, whose consequence is diminution of such coercion. Where there 
is institutionally-opportune coercion, the same precautions hold but, in 
this case as in the first, the social task is to define the amount and 
types of coercion we are willing to accept, and the protections against 
abuse we set up. 

As was noted earlier, one solution is to separate programs from 
coerced consequences. In a prison, for example, diminution of coercion 
would not be contingent on research or academic or training programs, but 
other consequences might be attached. The congruent contingencies of the 
preceding section might be considered in this connection. The con- 
tingencies for noncoerced programs (outcomes and subject matter) in IIC 
systems would tend not to be specific to those systems, but concordant 
with those of the world outside. 

There is, however, one type of program which is specific to the 
coercive system, rather than being concordant with the world outside, 
which might seriously be considered for both IIC and IOC systems. This
is a program of research, treatment, or both whose maintaining outcome 
is nonrecidivism. Under appropriate precautions, such programs may be 
characterized by noncoercive mutuality of outcomes as well as by congruent 
contingencies for program-relevant behaviors of professionals and inmates/
patients/students/research subjects.

In a prison system, a course of study which prisoners often readily 
enter into is how to avoid being sent up next time. The courses, of 
course, are informal and are taught by colleagues sub rosa. The non-
recidivism at issue is defined by them as operationally as it is by any 
sociologist, namely, nonreturn. The social intent, or contingency, is 
in nonreturn reflecting nonrepetition of offense: the discharged pris-
oner goes forth and sins no more. The contingencies governing the 
inmates may be otherwise: how to get away with it. Differences between 
operational definitions and operant contingencies notwithstanding, the 
popularity of the courses and their prevalence commend them to our
attention as indicative of voluntary enrollment. Returning to the operant 
contingency permits the following suppositions. Suppose we try to 
develop (research/treatment) a program in the institution which trains
complex repertoires and skills concordant with those on the outside. 
Suppose these would then provide consequences critical to the inmate. 
Suppose the skills are socially acceptable. Suppose enrollment in the 
program is not governed by consequences made critical by the institution, 
but by consequences concordant with those outside, as discussed earlier, 
and that enrollment here is one of several options available.

In a clinical situation, an analogous program, applicable as well 
to the world outside, would be a prevention or nonremission program. 

In a mental hospital setting, Fairweather et al. (1969) set up a
research-treatment program whose subject/patients worked together in the 
institution to develop skills for each other which would maintain them 
in their own community-setting outside. A token economy was devised in 
conjunction with carefully articulated programs of increasing approach to 
such skills, in accord with p.i. Differences between these patients and 
controls with similar problems in socially-desired measures such as self- 
esteem while in the program and recidivism thereafter are striking. Keehn 
et al. report related use of a token economy for alcoholics in a community
of their own. 

Consideration of the specific procedures used and their rationales 
is beyond the scope of this discussion. The issue is raised only in terms 
of its relevance for consent, coercion, and social contingencies. Many 
types of responses can be established within institutional settings 
involving IIC and IOC. The maintaining consequences are often increased 
convenience for the staff, or demonstration of lawfulness for the 
investigator. That programing procedures can be applied to the investi- 
gation, development, and treatment of nonrecidivism for a variety of 
socially important contingencies suggests the possibility of noncoerced
participation in programs which typically utilize coercion, since their 
outcomes are specific to the coercive systems involved. These programs 
provide consequences for the individuals, the professionals, and the 
social systems which are important to each. 

V. CONTRACTUAL RELATIONS 

The social fiduciary model (f.m.) assumes inequality in powers. One
party exercises its powers in the fulfillment of a trust for the protection
of its wards, the other party. An alternative model is the social con- 
tractual model (cm.) between two consenting parties assumed to be equally 
capable of consent. The powers are exercised in fulfillment of a future
exchange for mutual benefit. What each party delivers the other in 
the exchange is explicitly stated. 

It has been customary for practitioners and investigators to regard 
themselves as functioning within a f.m., and attentive to the welfare of 
those entrusted to their care. And if these professionals are hurt by or 
are indignant over what they interpret as an unjustified mistrust, they 
need but reflect on the steady public erosion in acceptance of the social 
f.m. (as distinguished from legal f.m.), and the steady substitution of 
social cm. (as distinguished from legal or commercial cm.). The change 
is reflected in relations between governments and citizens (formerly 
governed, or rulers and subjects), employers and employees, and husbands 
and wives, to mention but a few. Indeed, it would be surprising if 
treatment or research escaped this trend. The slogan "Sit back and let 
us do the driving" may sit well in advertisements for a bus company, but 
it is being treated as skeptically when the practitioner states it in one 
form or another (trust us to decide for you; we'll keep our own house in 
order) as when government officials make such statements about their 
operations. 

It is interesting to note that the Constitution in essence follows a 
model which tries to balance distrust of those in power with the necessities 
of effective exercise of power, and allows the federal government only 
those powers explicitly granted it. All nonspecified and residual powers 
are reserved to the (States and) people, the other socially contracting 
party. Elsewhere I have discussed the difficulties faced by mental illness 
professionals consequent on their substitution of a reversed model, in 
which the treatment system has all powers except those it grants its 
charges (Goldiamond, 1974). This model is contrary to the assumptions
of the constitutional cm., and is much closer to f.m. assumptions. 

Each of the contracting parties is assumed to be equally capable of 
consent. My present concern will be with the equality relation. Capability 
will be considered in the next section. If there is to be equality, it 
might be reflected in equal specificity of the terms mutually agreed 
upon. However, contracts are often biased in specificity, imposing 
greater requirements for specificity upon one side rather than the other. 

A familiar example of a contract where the burden of specificity is 
upon the client (payer) is the apartment lease. Here the responsibilities 
of the tenant are detailed so explicitly that they must be printed in 
small type in paragraph after paragraph. Aside from description of the 
premises provided by the agent (payee), provision of heat, access and other 
agent responsibilities are stated in general terms, which are kept to a minimum. 

On the other hand, the burden of specificity is upon the agent 
(payee) in the consent forms for patients to sign before admission to 
hospitals or for procedures within them. What the hospital or staff might 
or might not do, that is, its responsibilities, are often spelled out in
such explicit detail that they require paragraph after paragraph of small 
type. What is required of patients is minimally explicit, and quite 
general . 

While the burdens of detail imply a breakdown in trust-relations, 
differences in sidedness of the general-detail relations also imply the
direction of whatever trust relation remains. In the hospital, the patients 
are to entrust the care of their persons to the professionals. For the 
apartment, the landlords are to entrust the care of their property to the 
tenants. However, patient-professional relations follow mainly from a f.m., 
whereas tenant-landlord relations follow mainly from a commercial cm. 
Accordingly, trust is involved in both cases. Indeed, mutual obligations
and responsibilities entered into the feudal f.m., even as faith and trust 
enter into commercial cm. But the fact that I trust the manufacturer from 
whom I purchase my refrigerator to have exerted reasonable standards and 
precautions in its manufacture (with legal sanctions contingent on their 
violation) puts our relations no more on a f.m. than the mutual obligations 
of feudalism (with sanctions contingent on violation) put relations between 
noble and serf on a commercial cm. Commercial cm. are compatible with
assumptions of trust, and one does not require a f.m. for a trust relation.
We are loyal to certain stores and products and suspicious of certain 
professionals. 

It is likely that the existence of elements of each model in the other 
derives from differences in social decision rules and other relations 
applied historically at different times, with the resultant present 
situation representing different historical weaves. One outcome of the 
interaction of these weaves and changing modern conditions is that a 
fiduciary relation with which professionals felt comfortable and had worked 
from since the days of, say, Hippocrates, at least, is being interpreted 
as delegation of carte blanche powers to the professional. Accordingly, 
legal redress is being sought and other models are being applied. In 
this period of confusion, certain protections accorded to the individual 
by the social f.m. are being retained, while obligations upon the pro- 
fessional by the social cm. are being added. It is probably in this con- 
text that statements by professionals that patients have obligations, too, 
are to be considered. Viewed in cm. terms, a contract between an institu- 
tion and individuals should not only spell out in detail what its obli- 
gations are (as is the present case), but would also spell out in equal 
details what the patient/subject obligations are (as is not the present 
case). If the field is moving to the social cm., then the f.m. obliga-
tions should be changed to the explicit exchanges required by social cm. 
Otherwise, both treatment and research delivery may suffer. Possibly 
this is necessary to preserve or produce a balance. Possibly the present 
division is considered as one-sidedly favoring the professional. Perhaps 
advancing technology is producing lop-sidedness in this direction, unless 
correctives are instituted. However necessary such corrections, if 
treatment and research delivery suffer, so too, will present and future 
patients, and the social system. 

In all events, we might start making explicit what is involved and 
required. If a f.m. is to be retained, I am suggesting that this decision 
be treated as a decision, rather than as an article of faith or precedent. 
This would involve comparison of this option (retain f.m.) with at least 
one well-defined alternative (substitute cm.), in addition to the other 
technical requirements of such analysis, the costs and benefits of
each, and the decision rule we might follow.

Service and outcome contracts. Two types of social cm. will be 
noted: a time/effort (service) cm. and a specific-outcome cm.

In the time/effort cm., the professional guarantees time and effort
and the client pays for these. In return for payment, the practitioners 
guarantee neither recovery nor cure (occasion-outcome reversal) but simply 
that they will put in the time and skills necessary and paid for. The 
physician, teacher, and automobile mechanic are paid for time/effort by 
their patient, student, and customer clients, respectively. This type of cm. also applies
to research grants. Here the granting agency pays and in return the 
university guarantees neither results nor contributions (occasion-outcome 
reversal), but simply that it will guarantee the time and skills of its 
principal investigator. 

The time/effort cm. of a research grant might serve as a model with 
which treatment cm. are to be concordant. The client granting-agency, 
as was noted earlier, keeps track records of the accomplishments and 
previous awards of its principal investigator. The university and investi- 
gator keep similar records. The p.i. specifies procedures and rationale 
in detail, and the agency examines these with equal attention. 

The patient, of course, is the client in clinical treatment. Lest 
it seem far-fetched to suggest that clients keep track-records of practi- 
tioners, at least one consumer group is now doing so in at least one 
branch of clinical treatment. Track records of different educational- 
treatment institutions for client-student use are available to potential 
students and, in some cases, are prepared by professional educational
associations themselves. Peer evaluation is thus made available to 
clients in education, as it is in grant review (the client is the agency), 
and this is not considered unprofessional. 

The time/effort type of cm. is generally used when outcomes are 
uncertain, or procedures have not been expressly validated. This is what 
research is about, of course, and this may underlie the confusion of 
experimentation with practice by practitioners. Where outcomes are more 
certain, where validated procedures are used, a different type of relation 
holds. 

In the specific-outcome cm., the professionals guarantee the delivery 
of outcomes or products which will meet explicit specifications. They 
are paid in return for this guarantee or performance. The research contract 
belongs in this category. In the educational treatment system, perfor- 
mance contracting has been tried, with mixed results. Here, the educa- 
tional system is paid contingent on stipulated levels of performance by 
its students, following training. Since specific-outcome cm. assume 
validated procedures, the procedures and delivery can be cost-accounted, 
and fees can be fixed. In health care, the "Blues" and other third-party
payers often provide fixed-fee reimbursement for specified procedures; this
would appear to assume validation and certainty. It is of interest that 
in the field of psychotherapy, behavior modification is moving in such a 
direction. Its practitioners speak of imposing upon themselves requirements 
which generally do not characterize other branches of psychotherapy nor, for 
that matter, most other branches of treatment. These generally follow 
the grant model. It is of further interest that behavior modification 
contracts make explicit not only what the therapist does at each step, 
but what the client is required to do. Although most such contracts and 
records are explicit in terms of the chain-transactions of each of the 
parties in the interactions, with regard to payment, the fees at present 
are mostly for time and services. Accordingly, in most cases, the programs 
belong in the grant category, that is, the first one mentioned. 

Contracts in which the agency is paid for time/effort ("professional 
services rendered") or for outcomes delivered have differing costs and 
benefits which are beyond the scope of the discussion (one of the major 
accusations against time/effort cm. is that the delivery system, being 
reinforced for these, may maximize such reinforcement by increasing time 
rather than improving effort, which can better be accomplished through 
outcome contingent cm. On the other hand, the system may then select 
its treatments in terms of payment, rather than actual service). How- 
ever, the fact that the outcome cm. ("research contract") seems appropriate 
where the "state of the environment" is known, and the time/effort cm. 
("research grant") where it is unknown, suggests the possibility of a 
decision model with shifting strategy criteria, depending on states of 
knowledge, outcomes, and decision rule to be followed. 
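
The shifting-criteria decision model suggested here can be given a schematic statement: the outcome cm. is indicated where the "state of the environment" is known and procedures are validated, the time/effort cm. where it is not. The function below is a hypothetical rendering of that one distinction, not a rule advanced in the text.

```python
def contract_type(outcome_certain, procedures_validated):
    """Schematic decision rule for the two social cm. distinguished above:
    a specific-outcome cm. ("research contract") fits where the state of
    the environment is known; a time/effort cm. ("research grant") fits
    where it is not."""
    if outcome_certain and procedures_validated:
        return "specific-outcome cm."
    return "time/effort cm."

print(contract_type(True, True))    # validated procedures, certain outcomes
print(contract_type(False, False))  # uncertain outcomes
```

A fuller model would also weigh the costs and benefits of each contract type under whatever decision rule is adopted, as the parenthetical remark above indicates.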

Informed consent.



The social contractual model assumes that two consenting parties 
are equally capable of consent, and have given it. Fulfillment of the 
contract is not binding on the party which is deficient in either. 

Capability may be considered in terms of much of the preceding dis- 
cussions, which will be summarized for this purpose. Degrees of coercion 
are defined by the number of genuine choices between alternative options, 
the critical nature of the consequences which govern the behaviors 
involved, and the conditions by which the consequences are made critical. 

Degrees of coercion are inversely related to degrees of freedom, 
defined in terms of alternative well-defined sets of behaviors. Minimally,
df = 1; that is, there are two equally available options.

Genuine choices involve such options when contingency repertoires 
are equal. Equality of contingency repertoires requires equally available 
opportunities or occasions, equally available patterns of behavior, equally 
potent consequences and, since these are contingency repertoires and 
repertoires require establishment over time, equally functional contingency 
histories. 

Critical consequences are those which are generally potent over others 
when made contingent on a particular individual's behavior, given certain 
broad sets of conditions. 

Where, for genuine choices, df = 0, and critical consequences are 
attached to the option(s) and the consequences have been made critical 
by the system which provides them, coercion is then defined for that
option, and no consent is meaningful. Where df ≥ 1, and noncritical con-
sequences are attached, consent is meaningful to the extent that it and 
the contingencies involved are concordant with those obtaining for similar 
options in the world outside. If research participation meets these con- 
ditions, it is acceptable. 

Where, for genuine choices, df = 0, and critical consequences are
attached to the option(s), and the consequences were not made critical 
by the system which provides them, consent must be examined critically, 
unless other arrangements discussed are provided. These include some of 
those holding in the preceding case, as well as those holding when mutuality 
of outcome is converted to mutuality of contingencies. 

By and large, these define the conditions under which consent can 
be meaningfully obtained. They by and large define capability for consent. 
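
The summary just given can be restated as a classification over three features of a choice: degrees of freedom (df) among genuine options, whether critical consequences are attached, and whether the system providing the consequences also made them critical. The function below is a schematic restatement of the three cases, not part of the original analysis; the example in the final line is hypothetical.

```python
def consent_status(df, critical, made_critical_by_system):
    """Classify consent for an option, per the three cases summarized above.
    df is degrees of freedom among genuine choices (available options - 1)."""
    if df == 0 and critical and made_critical_by_system:
        # cf. institutionally-instigated coercion (IIC)
        return "coercion: no consent is meaningful"
    if df >= 1 and not critical:
        return ("consent meaningful to the extent contingencies are "
                "concordant with similar options in the world outside")
    if df == 0 and critical and not made_critical_by_system:
        # consequences critical, but not made so by the system (cf. IOC)
        return "consent must be examined critically"
    return "case not covered by the summary"

# Hypothetical: one option, with system-made critical consequences attached.
print(consent_status(0, True, True))
```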

What about the retarded, the illiterate, people who do not under- 
stand the language, and so on? Illiteracy and differences in language 
would seem to be governed by unequal availability of occasions, which
was discussed under genuineness of choice. There exists a more readily 
available guide which covers these cases as well as the retarded and 
other "incompetents". This derives from consideration of the social and 
commercial cm. If we apply the simple rule of concordance of accepta- 
bility of consent in the ordinary contractual case to the acceptability 
of consent in the cm. governing individuals and patients, few special 
rules seem necessary. Consent to the terms of a car contract signed by 
imbeciles would not be binding on them, nor should consent to treatment 
or research contracts be binding on them. The professional who proceeds 
under the assumption of validity of consent will face the same problems 
in a court of law as a car salesman who proceeds likewise -- and will 
probably face problems more severe if the harm is greater. And the 
same holds for a person speaking a foreign language. Courts have 
defined other situations as well. Institutionalization in a mental 
hospital does not deprive mental patients of certain privileges and 
rights of citizenship, including freedom to enter into or decline certain 
programs. Whatever genuine surprise is engendered by judicial opinions 
which question treatment/research consent under such conditions is 
probably derived from the fact that the professionals are not attuned 
to the applicability of contractual arrangements to their bailiwicks, 
rather than from their ignorance of the contractual relations involved. 
They encounter these daily as members of a complex commercial -industrial 
society. 

It would seem that attention to concordance with conventional con- 
tractual relations obtaining outside would eliminate at least some of the 
confusion surrounding the area. Whether the contracts are for time/effort 
or for outcome, the requirements on each party might be stated explicitly, 
as they often are outside. Where the issue is disclosure of data obtained 
during treatment or research, for research publication or for didactic 
presentation to improve treatment, and there is possibility of damage 
through identification, or invasion of privacy, or in other ways, the tort 
law prevailing on the outside, for damages unrelated to contractual ful- 
fillment, might be considered. Or where contracted disclosure was 
violated, the breach of contract model might be considered. 

It is probable that the laws and social arrangements are changing in 
these areas, even as the social contingencies they reflect are changing. 
Time-honored models whose definitions are implicit rather than explicit 
(e.g., intent and fiduciary models) make related social policies subject 
to varying interpretations and therefore to abuse by those in power who 
are so inclined. These models are gradually being joined by more explicit 
models, and the resultant confusion provides no fixed guides. In such 
cases, solutions to problems in the area of patient-subject protection 
may provide precedents and help provide solutions for a society that 
needs all the help it can get. 

In the meantime, we might profit from its past efforts and solutions. 
But this interchange can best be facilitated if the models applied to 
our areas of concern are consonant with those the rest of the social 
order is finding to be of increasing applicability. And these models 
include the contributions of the scientific systems of consequential con- 
tingency analysis found in behavior analysis, transactional analysis, 
exchange theory, decision theory and cost-benefit analysis; the con- 
tributions of the legal systems faced with requirements for explicitness;
and the contributions of the larger and equally explicit social contractual 
models they all reflect. 

References Cited 

Barber, Bernard. Experimenting with humans. Public Interest, 1967
    (No. 6), Winter, 91-102.

Beecher, Henry K. Scarce resources and medical advancement. Daedalus,
    1969, 98 (2), 275-313.

Beecher, Henry K. Research and the individual. Boston: Little, Brown
    and Co., 1970.

Blumgart, Herrman L. The medical framework for viewing the problem of
    human experimentation. Daedalus, 1969, 98 (2), 248-274.

Carlson, Rick J. The end of medicine. New York: John Wiley, 1975.

Currie, Elliott P. Crimes without criminals: witchcraft and its control
    in Renaissance Europe. Law and Society Review, 1968, 3 (1), 7-32.

Edsall, Geoffrey. A positive approach to the problem of human
    experimentation. Daedalus, 1969, 98 (2), 463-479.

Eisenberg, Leon. The child. Experiments and research with humans:
    values in conflict. Washington, D.C.: National Academy of Sciences,
    1975, 94-99.

Fairweather, G.W., Sanders, D.H., Maynard, H. and Cressler, D.L.
    Community life for the mentally ill. Chicago: Aldine, 1969.

Freund, Paul A. Introduction to the issue "Ethical aspects of
    experimentation with human subjects." Daedalus, 1969, 98 (2),
    viii-xiv.

Freund, Paul A. Legal frameworks for human experimentation. Daedalus,
    1969, 98 (2), 314-324.

Goldiamond, I. Perception. In Arthur J. Bachrach (Ed.) Experimental
    foundations of clinical psychology. New York: Basic Books, 1962,
    280-340.

Goldiamond, I. Moral behavior: a functional analysis. Psychology Today,
    1968, 2 (4), 31-34, 70.

Goldiamond, I. Toward a constructional approach to social problems:
    ethical and constitutional issues raised by applied behavior
    analysis. Behaviorism, 1974, 2 (1), 1-84.

Goldiamond, I. Alternative sets as a framework for behavioral
    formulations and research. Behaviorism, 1975, 3 (1), 49-86. (a)

Goldiamond, I. Singling out behavior modification for legal regulation:
    some effects on patient care, psychotherapy, and research in general.
    Arizona Law Review, 1975, 17, 105-126. (b)

Gray, Bradford H. Human subjects in medical experimentation: a
    sociological study of the conduct and regulation of clinical
    research. New York: John Wiley, 1975.

Hearings before the Subcommittee on Health of the Committee on Labor and
    Public Welfare, U.S. Senate, 93rd Congress, First Session. Quality
    of Health Care -- Human Experimentation, 1973. Washington, D.C.:
    Government Printing Office, 1973.

Hendershot, C.H. Programmed learning: A bibliography of programs and
    presentation devices. Bay City, Michigan: Hendershot, 1967.
    Supplements 1967, 1968, 1969.

Hendershot, C.H. Programed learning and individually paced instruction.
    Bay City, Michigan: Hendershot, 1973.

Jonas, Hans. Philosophical reflections on experimenting with humans.
    Daedalus, 1969, 98 (2), 219-247.

Keehn, J.D., Kuechler, H.A., Oki, G., Collier, D., and Walsh, R.
    Interpersonal behaviorism and community treatment of alcoholics.
    Proceedings of the First Annual Alcoholism Conference of the
    National Institute on Alcohol Abuse and Alcoholism: Research on
    Alcoholism: Clinical Problems and Special Populations. Rockville,
    Md.: National Institute on Alcohol Abuse and Alcoholism, NIMH,
    153-176.

Ladimer, Irving. Ethical and legal aspects of medical research on
    human beings. In Irving Ladimer and Robert W. Newman (Eds.)
    Clinical investigation of medicine: Legal, ethical and moral
    aspects. Boston: Boston University, 1963, 189-194.

Levine, Robert J. The boundaries between biomedical or behavioral
    research and the accepted and routine practice of medicine. Paper
    submitted to the National Commission for the Protection of Human
    Subjects of Biomedical and Behavioral Research, Washington, D.C.,
    July 14, 1975. (a)

Levine, Robert J. The boundaries between biomedical or behavioral
    research and the accepted or routine practice of medicine. Addendum
    to paper submitted to the National Commission for the Protection of
    Human Subjects of Biomedical and Behavioral Research, Washington,
    D.C., September 24, 1975. (b)

Markle, Susan M. Good Frames and Bad. Chicago: Tiemann Associates,
    1975 (3rd edition).

McDermott, Walsh. The risks of research. Experiments and research with
    humans: Values in conflict. Washington, D.C.: National Academy of
    Sciences, 1975, 36-42.

Mead, Margaret. Research with human beings: a model derived from
    anthropological field practice. Daedalus, 1969, 98 (2), 361-386.

Moore, Francis D. A cultural and historical review. Experiments and
    research with humans: Values in conflict. Washington, D.C.:
    National Academy of Sciences, 1975, 15-30.

Moore, Jay. On the principle of operationism in a science of behavior.
    Behaviorism, 1975, 3 (2), 120-138.

Parsons, Talcott. Research with human subjects and the "Professional
    Complex." Daedalus, 1969, 98 (2), 325-360.

Peters, R.S. Ethics and education. Glencoe, Ill.: Scott, Foresman and
    Co., 1967.

Reichenbach, Hans. The rise of scientific philosophy. Berkeley, Cal.:
    University of California, 1951.

Robbins, Frederick C. Overview. Experiments and research with humans:
    values in conflict. Washington, D.C.: National Academy of Sciences,
    1975, 3-7.

Sherman, J.G. (Ed.). Personalized system of instruction. Menlo Park,
    California: W.A. Benjamin, 1974.

Skinner, B.F. Contingencies of reinforcement: a theoretical analysis.
    New York: Appleton-Century-Crofts, 1969.

Skinner, B.F. About behaviorism. New York: Knopf, 1974.

Weiner, H. Human behavioral persistence. Psychological Record, 1972,
    22, 84-91.

Wexler, David. The surfacing of behavioral jurisprudence. Behaviorism,
    1975, 3 (2), 172-177.



15 



BOUNDARIES BETWEEN RESEARCH AND THERAPY, 
ESPECIALLY IN MENTAL HEALTH 

Perry London, Ph.D. 
and 
Gerald Klerman, M.D. 



Boundaries Between Research and Therapy, Especially in Mental Health: 

Perry London 

University of Southern California 

and 

Gerald Klerman 

Harvard University 



Terminology and Scope of Treatments 

There is no universally accepted terminology covering the many 
kinds of treatment used in the field of mental health. The vast 
majority of such treatments, however, consist more or less completely 
of some kind of verbal dialogue between the person administering the 
treatment and the person who receives it. This is true of virtually 
all treatments which go under the titles of counseling, case work, 
insight therapy (including psychoanalysis in most of its forms and 
variants, and client-centered or non-directive therapy), psychiatric 
or psychological interviews or consultations, encounter groups, 
humanistic or existential therapy, Rational-Emotive Therapy, Transactional 
Analysis, and most forms of group psychotherapy and behavior therapy. 

A second order of psychotherapeutic activity also uses verbal 
interactions as the main instrument of treatment, but in more dramatic 
or unusual forms than that of conventional conversation, and often 
combined with specific behavioral methods of rehearsal or with altered 
states of consciousness. Included in this category are Psychodrama , 









Gestalt Therapy, Assertion Training, Relaxation Training, Sex Therapy, 
Desensitization, Implosive Therapy, Behavior Shaping or Operant 
Conditioning, and Hypnosis. 

Finally, a third class of mental therapy makes active use of 
equipment or of physical manipulation of the body by variations of 
massage. Included here are Aversion Therapies, Biofeedback, Bioenergetic 
Therapy, and Rolfing. The psychotherapeutic use of tranquilizing, 
energizing, antidepressant, and other psychotropic drugs is also, 
logically, in this class of treatments, as is electro-convulsive 
therapy (ECT). In fact, there are some forms of aversion therapy 
which use drugs as the active agent for producing aversive responses, 
such as the treatment of alcoholism by Antabuse, and the combination 
of psychotropic drugs with verbal and other psychotherapeutic methods 
is increasingly common and very promising. Since the use of psycho- 
tropic drugs is already regulated by the FDA, however, it is not 
included in this discussion. Most aversion therapy uses mild electric 
shock as the repulsive agent, and this usage is not currently regulated. 

It should be noted, moreover, in discussions of this class of 
treatment, that Biofeedback, despite its use of often very sophisticated 
physiological recording equipment, is not strictly comparable to the 
other treatments which involve very specific manipulations of the body, 
either by inducing physical discomfort (aversion therapy) or by massage 
(bioenergetics, rolfing). Biofeedback equipment simply records ongoing 
physiological processes and then gives the patient information about them 






in the form of auditory, visual, or tactile signals. The patient can 
then learn to alter the body processes by learning how to manipulate the 
sensory signals. Taking one's own pulse or observing one's own breathing 
are literally forms of biofeedback. The therapeutic use of this 
technology, despite the equipment involved, may actually be more closely 
related to some simple forms of Behavior Therapy, such as Relaxation 
Training, than to treatments which manipulate the body. I have 
included it in this class only because most biofeedback treatment 
involves the attachment of electrodes to the body connecting it to 
complex machinery, which gives the whole thing an aura of scientific 
quality which can easily mislead patients, research subjects, and 
legislators into thinking it is either more dangerous or more effective 
than is necessarily the case. 

The listing above does not include all the named forms of mental 
health treatments by a long shot. Estimates vary up to 130 or more 
names. It does, however, include representatives of every kind of mental 
therapy used by psychiatrists, psychologists, social workers, nurses, 
counselors, and all those professionals and paraprofessionals who 
claim expertise in this domain except neurosurgeons, whose work is not 
discussed here. Since there is no single term which adequately covers 
the field, moreover, I shall use the terms "mental treatment" (or 
"therapy"), "psychological treatment" (or "therapy"), and "psychotherapy" 






interchangeably to refer to all treatment methods and classes in the 
entire field of mental health, except where specified otherwise. 

Distinguishing Research from Therapy 
Intent 



From the perspective of protecting subjects, the first thing that 
distinguishes research from therapy is the intent of the subject. So 
the first mandate of the researcher is to explicate to the subject 
whether, and to what degree, the manipulations involved are aimed at 
getting knowledge, independent of whether they will help the person. 
You don't need to place the same burden on the therapist, since 
therapists' and subjects' aims always coincide anyhow — that is, the 
reasonable presumption is, when someone goes to the doctor, that they 
are going for some benefit to themselves, not for the primary purpose of 
benefitting the doctor by giving him information. 

The biggest problem arises when the intent of the subject is primarily 
to get help and the intentions of the therapist are mixed either by a 
compound of scientific and therapeutic motives or by the fact that the 
only treatments available are experimental, that is, new enough or 
controversial enough so that their suitability for the given case is 
doubtful. The latter case, of innovative or experimental treatments, 
subsumes the case of mixed therapist motives. The problem, in turn, 
reduces to: "When should we define a therapeutic activity as being 






research, regardless of the declared intentions of patient or therapist 
to label it 'treatment'?" 

For the purposes of the Commission, this means that establishing the 
boundaries between research and the routine and accepted practice of 
mental therapies does not require the definition of research, but only 
the definition of therapy, because we are saying that anything which 
purports to be therapy but is not routine and accepted as such is 
automatically research. 

The issue of therapist or investigator intent is not logically 
important for our purposes. If someone intends that his work should 
be considered research, it is research, in terms of needing safeguards 
for protecting subjects, regardless of whether its methods are intended 
to be therapeutic. But not all that is intended to be therapy is therapy, 
for these purposes. This means, in effect, that we do not need to define 
research at all. We can attack the problem of boundaries meaningfully by 
recognizing that the practical problem is that many therapeutic methods 
are well intended, but poorly established (in terms of safety, efficacy, 
and economy). One cannot demonstrate the efficacy of a therapy in terms 
of the intentions of its proponents, because nice guys, in addition to 
finishing last, may propose ineffective treatments. And they may even 
propose harmful therapies with the best of intent ions . No more can a 
therapy be considered routine and acceptable on the basis of authority. 
Only evidence will do. 






Peripheral Problems 

Once it is clear that the central problem of boundaries can be 
settled adequately by limiting our inquiry to the definition of therapy 
and the assessment of routine and accepted practice within that definition, 
several problems which may be important in other contexts become 
peripheral here and may be dismissed from this discussion: 

Who pays whom; who asks for help from whom; whether the primary goal 
is the accumulation of knowledge rather than the assistance of an 
individual; whether there is a research protocol: all of these become irrelevant questions 
for our purposes. Nor is there any need to distinguish here between 
"research" and "experimentation", or to separate either of them from 
"experimental" or "innovative" therapy. Dictionary aside, from the 
vantage of protecting subjects, they are all the same. 

Distinguishing Mental Health from Other Medicine 

The problem of "accepted and routine practice" in the field of 
mental health differs somewhat from the same problem in other fields of 
medicine for three reasons: 1) The number and variety of nonmedical 
people, in and out of the learned professions, who have legitimate 
input into this field is considerably greater than is true in any other 
branch of medicine. So routine and accepted practice cannot, in most 
respects, lean on the specific training, licensure, or certification of 
the practitioner to help define the behavior in question. 2) The specific 






practices which can be defined as therapeutic overlap so much with 
equally valid definitions of them as educational, recreational, or 
religious, that it is presumptuous and impractical to try to restrict 
these practices completely to medical or therapeutic functions or 
functionaries. 3) The goals for which mental therapies, however 
defined, are sought, and the sensible criteria for deciding whether they 
have been achieved, are so diverse that they cannot all be contained 
within any definition of health short of the WHO definition, and 
that definition is too broad for legislative use. 

If these problems are recognized at the outset, it may then be 
reasonable to seek boundaries between therapy and research in the many 
contexts where the distinction between them can be meaningfully designed 
to protect the subjects of research, without trying to comprehend and 
include every context in which such distinctions are possible. The 
development of meaningful regulations in this connection might not, 
for instance, seek to restrict a church from conducting Transactional 
Analysis meetings for its members, even were it clear that the procedures 
involved are technically considered innovative or experimental therapy. 

A more immediate illustration, perhaps, is that of Institutional Review 
Boards, which automatically review anything that purports to be research. 
The question for them is: what things should they review that claim to 
be therapy, or that do not claim to be research? Evidently, they would 
have to review all training and demonstration grant proposals in which 






the procedures to be taught or shown fall outside the scope of "routine 
and accepted practice". To do so, however, they would have to have 
therapeutic guidelines. At the present time, the only such things in 
the field of mental health are FDA guides for the use of drugs. There 
are no equivalents for psychotherapy or counseling. Without them, any 
IRB whose members were very knowledgeable about the state of the art in 
psychotherapy would find themselves hamstrung. The need for such 
guidelines, as we shall see, seems virtually inescapable if the 
protection of human subjects in this domain is to be meaningfully regulated. 

Outcome Criteria 

The problem of deciding when a therapy, treatment or training 
program has been sufficiently tested so that it is no longer experimental 
is, on the face of it, the same as the problem of when a drug achieves the 
status of acceptability, so that it no longer has to be considered 
experimental. By and large, the things at issue are safety, efficacy, 
and economy. In the case of therapy in the field of mental health, 
economy may be subsumed under efficacy because one of the most important 
criteria of acceptability in psychotherapeutic kinds of treatment is 
the length of time and amount of effort it takes for a treatment to 
work in comparison to other treatments and in comparison to nontreatment 
conditions. 

The problems of safety and efficacy in mental treatments are not 






necessarily the same as with drugs — in general, efficacy is a bigger 
problem, safety a smaller one, and both more complicated. Both issues 
are joined as the problem of outcome criteria, which has plagued the 
field of mental health since the advent of modern psychotherapy. 
That problem, stated briefly, is: 

What are the goals of psychological treatment? 

How can we tell whether they are being met? 

What dangers attend the treatment process? 

In the early history of psychotherapy, the goals of treatment tended 
to be clear. They were the relief of specific symptoms of neurosis, 
such as phobias and other anxiety states; the repair of hysterical 
conversion reactions, such as hysterical blindness or paralysis, or of 
dissociative states, such as amnesia; the relief of disabling obsessional 
thought patterns and compulsive rituals; and the restoration of good 
feeling in people incapacitated by depression. Insofar as such specific 
symptomatology is to be found in people who are given psychotherapy, relatively 
efficient outcome criteria can be established, because the clear definition of 
the problem permits a fairly clear determination of whether or not it has been 
relieved. A large proportion of neurotic and psychophysiological conditions 
are of this kind. 



Safety. Since the end of World War II, however, and more pronouncedly 
since the 1960's, when encounter groups became very popular through the 
offices of humanistic psychologists and the "human potential movement," 




more and more psychotherapeutic activity has been undertaken for 
nonspecific conditions, where the people requesting treatment would not 
admit to specific problems of the kind contained in conventional 
psychiatric nomenclature. Some of these conditions were represented 
as general malaise, disaffection with one's circumstances, or unhappiness, 
that is, as existential problems which might properly lie completely 
outside the purview of mental health, in its technical sense. Others 
were represented as recreational, educational, or quasi-religious, 
that is, as the desire of people who were not only free of symptoms, 
but were even happy with their lives, to have therapy as a "positive 
growth experience" which, on the face of it, is even further removed 
from the domain of mental health technology. These conditions are 
matters of concern here because, while the definition of the problems 
in such cases places them outside the arena of mental health, the 
methods which are applied to those problems may be potentially harmful 
to some of the people they are used on. 

The recreational or educational character of psychotherapy is 
comparable, in this respect, to elective cosmetic surgery. Your 
intention, in getting a "nose job," may be to get more beautiful, 
rather than to get healthier — but the surgeon's knife will do just 
as much damage one way as the other, if it slips. In fact, many of the 
"awareness enhancing" methods of the human potential movement were 
specifically developed as psychological treatments and were published 






under the authorship of trained and licensed mental health practitioners. 
It would be specious to view them as anything other than mental therapies. 
This definition of the method may necessitate that some such treatments 
be regulated under the outcome criterion of safety, even if the 
nonspecific character of the "problem" makes the positive efficacy of 
the treatment method irrelevant. In practice, this could mean that 
encounter groups of the kind run at "growth centers" such as Esalen, or 
Arica, or EST (Erhard Seminar Training), might all be subject to scrutiny 
as innovative or experimental psychotherapies, even though they do not 
claim to be mental health treatments and even though their customers do 
not claim to need or want mental health treatment. 

As if the foregoing were not complicated enough, from the vantage of 
practical regulatory measures, the very same logic might apply equally 
well to the increasing deliberate application of behavior modification 
principles to routine classroom teaching problems, such as the improvement 
of reading or arithmetic skills; and it could also apply to self improve- 
ment programs such as Weight Watchers, which increasingly makes 
deliberate use of behavior modification to help people control obesity. 
Indeed, applying the safety criterion to the methods in question may 
necessitate just such scrutiny, regardless of where those methods are 
to be used. 

The judicious application of the safety criterion would probably 
exempt both Weight Watchers and arithmetic teachers from regulation, 






however, because enough research already exists to show predictably 
that the application of the behavior modification principles involved 
has very low probability of doing any specifiable damage to any arithmetic 
student or obese person under almost any circumstances. Existing 
research would be less likely to exempt encounter groups or EST, 
however, and the establishment of reasonable guidelines would not be 
easy for deciding "how safe is safe", that is, how much of what kind of 
harm is "allowable" to what percentage of people who undergo that 
"treatment". 

In principle, the safety problem with psychotherapy is the same as 
with drugs. But in practice, it is more complex and less ominous at the 
same time. Few people, if any, die from psychotherapy, or get grossly 
incapacitated, and the few who do tend to do so by such slow stages 
that reasonable observers might attribute the damage to other circum- 
stances than the treatment. Even so, some people are harmed by 
psychotherapy, and more potentially can be harmed as mental treatments 
become more efficient, which they will — so the need to regulate the 
protection of research subjects must logically include the implementation 
of some means for regulating the safety of innovative mental therapies, 
however complex the problem is. Drugs undoubtedly kill and injure more 
people, but the determination of their safety is aided significantly by 
the fact that the damage they do tends to be more specific, more 
dramatic, and sometimes visible on other animals than human beings. The 






safety of psychotherapies is a more complex problem of definition and 
of empirical determination — therefore, it is also a more complex 
problem of regulation. 

Efficacy is a bigger problem than safety in mental health treatments, 
because their downside risk is more likely that of being harmless and 
useless than of being very potent in either a beneficial or dangerous 
direction. Even so, the means by which efficacy is established for 
psychological treatment of all kinds is, in principle, essentially the 
same as the means by which effective outcomes are determined in any 
other domain — by empirical assessment of the relative precision with 
which a given technique achieves a predetermined result in comparison 
with all other conditions under which the same result is or is not 
achieved. The foregoing proposition states, in clumsier than usual 
language, the principle by which the syllogistic determination of cause 
and effect (If a, then b; if not b, then not a) is applied to all 
scientific problems. Translated into the specifics of mental health, 
it says: 

A psychological treatment is effective if it achieves its specified 
goals. The faster it achieves them, and the more people it achieves them 
on, and the more thoroughly it works on those people, the more effective 
it is. The comparisons involved are comparisons of the treatment in 
question to other possible treatments, including no treatment at all. 

The specifics of the kinds of cost-benefit analyses which would go 






into the actual assessment of any given therapy are somewhat variable, 
but there is no need to pursue them in detail in this essay because the 
principles involved are well known and unequivocal: They are the 
fundamental principles governing all scientific investigation — 
measurability and replicability. For a therapy to meet the efficacy criterion, it 
must be measurably better than other treatments and than nontreatment 
by standards which permit independent observers, using the same methods 
the original investigator did, to disconfirm his results. When 
others have tried, and failed to disconfirm, then the efficacy of the 
treatment is established. Until then, it is not. 

This notion of efficacy has two important implications for the 
purposes of the Commission: 

1) With respect to mental health problems, it allows for the 
legitimacy of idiosyncratic, unconventional definitions of treatment, 
or improvement, or cure, provided only that those definitions can be 
subjected to the same empirical evaluation procedures as any others. 
This makes it possible, for instance, for Thomas Szasz to argue that 
the notion of mental illness is a banal fiction and still to propose 
treatment models which can be validated as effective mental health 
instruments. It separates the empirical problems of treatment from the 
theoretical problems of defining the discipline. 

2) It implies that the boundaries between research and practice may, 
for practical purposes, be established without concern for the intent 






of the investigator or practitioner, if not the patient. If treatment 
efficacy is established only when repeated investigation by conventional 
scientific rules has failed to disconfirm a treatment's relative 
efficacy, and "routine and accepted practice" is routine and accepted 
because the treatments involved are effective, then the boundary between 
research and practice is the degree to which the knowledge of efficacy 
exists. That knowledge is a complex but inevitable function of the 
extent to which the relevant research has already been done and replicated, 
not of the intentions of the particular scientist or therapist. 

In its most general sense, research means trying to find out something 
that you don't know, which makes intent seem critical. But from the 
vantage of social regulation, and from that of the scientific community, 
the definition of a research problem is not what you know about something, 
but what is known about it. From their personal perspectives, little 
boys and girls poking around each other in the bushes are doing research 
on where babies come from. But from the vantage of the community, the 
question is not a proper subject of scientific research because the 
answer is already well known. (The stork brings them.) By the same 
token, the definition of routine and accepted therapeutic practice, in 
any domain which is subject to scientific inquiry, depends on the extent 
to which the relevant scientific questions have already been answered. 
The more they have been answered, the more 






a given form of practice is routine and accepted. The less they have 
been answered, that is, the more the questions of efficacy are open to 
scientific inquiry, the more a given form of practice becomes research, 
no matter what the intentions of the practitioner may be. The size and 
quality of the body of inquiry addressed to those questions and the 
size and quality of the body of knowledge it has produced index the 
permeability of the boundary between the two. In the complex variable 
systems of the biomedical and behavioral sciences, far more than in the 
physical sciences and their applications, the assessment of that boundary 
is a matter for negotiation. For practical purposes, this means that the 
determination of the boundary requires continuous, conscientious, and 
sophisticated scrutiny, assessment, and reassessment of the scientific 
status of the treatment arts. 

Guidelines for Guidelines 

The detailed means for best conducting that assessment are not ob- 
vious, nor is the process of expert negotiation and consensus which will 
best summarize and judge the scientific status of each mental therapy, 
disseminate the information in the form of guidelines for review boards 
and funding agencies, and assure the proper revision of those guidelines 
as new knowledge accrues and old biases surface. Perhaps a new office 
should be created within HEW for this purpose, and perhaps it could de- 
rive some guidelines for creating guidelines from the practices now used 
by the Food and Drug Administration for evaluating drugs and by NIH for 






evaluating research grant proposals. There are some special problems 
connected with evaluating psychological treatments that may require some 
thoughtful innovation in regulations and bureaucratic procedures -- 
the unreliability of psychiatric diagnoses, for instance, makes it harder 
to be sure you have met your intended outcome criteria in any given re- 
search study than would otherwise be true, even if the specific results 
of this study are significant statistically. Additionally, there are 
often large variations in therapeutic procedures of a single kind, de- 
pending on personal qualities of the therapist unrelated to professional 
training or competence, which might further confound the interpretation 
of results from one experiment to another, even where the subject selec- 
tion criteria in both studies have been reliably the same. And such 
vagaries, among others, make the assorted biases of the people doing the 
evaluation and review much more influential, potentially, in their judg- 
ments of mental health treatments than might be true if they were evalu- 
ating drugs. And there are still other special problems. 

What does seem obvious, in any case, and despite the problems involved, 
is that there cannot be any meaningful protection of research subjects in 
the field of mental health research unless there is regulation of inno- 
vative, experimental, research-demanding mental health treatments. The 
classification of treatments in that box, separating them from routine 
and accepted practice, requires, in turn, the preparation of objective 
guidelines based on comprehensive, fair-minded evaluation of empirical 
evidence, and routinely revised as new reasoning and new discovery dictate. 






16 



LEGAL IMPLICATIONS OF THE BOUNDARIES BETWEEN BIOMEDICAL 
RESEARCH INVOLVING HUMAN SUBJECTS AND THE 
ACCEPTED OR ROUTINE PRACTICE OF MEDICINE 



John Robertson, J.D. 
December 31, 1975 



This paper discusses the legal implications of physician activities 
that occur on the boundary between research and the accepted practice of 
medicine. After showing that no major legal consequences turn on the 
characterization of an activity as research or practice, the paper then 
discusses whether legal consequences should attach to the distinction, 
concluding with a general discussion of policy alternatives for innovative 
therapy. 

Boundary activities* require consideration in developing public policy
for research with human subjects 1/ because they subject patients under
the guise of therapy to risky, untested procedures without the safeguards
that apply to experimentation. The problem arises because physicians often 
undertake diagnostic or therapeutic procedures about which little is known 
and which deviate substantially from routine, accepted practice. This may 
occur because there is no known effective cure and the physician seeks a 
procedure helpful to the patient, or because the new procedure appears to 
be superior in cost, efficacy, or side-effects to the standard procedure. 
Because data establishing efficacy may be lacking, its use may be said to 
be experimental. The concern here is that untested therapies will be 
used without controlled clinical trials to the detriment of patients, 
and may even come to be accepted as standard therapy, when later experience 
shows that they are actually inefficacious or harmful. 

Current HEW policy views such activities as placing a subject at
risk and hence subject to IRB review because they "depart from the
application of those established and accepted methods necessary to meet
his needs." 2/ In addition, whatever the physician's specific intent in
employing the new procedure, the consequences are likely to resemble
the consequences of activities done with a specific intent to do research.
The application of an innovative therapy will often yield knowledge that
affects treatment of future patients in the same situation. Also,
experience with one or several patients may lead to publication, and thus
for the physician approximate the consequences of the research enterprise.

*In this paper the terms "boundary activity" and "innovative therapy"
are used as synonyms.

At the same time, however, treatment of all boundary activities as 
research poses conceptual and policy problems because an experimental 
intent may be lacking. The physician using an innovative therapy may 
have no research or experimental aims beyond helping his patient. If 
asked, he will say that he is engaged in therapy only, and intends only 
to treat this patient rather than conduct research beyond that involved 
in any diagnosis or therapy. Moreover, a public policy 

that treats all boundary activities as research will implicate the govern- 
ment in physician practices far beyond those directly funded by HEW or 
occur ing in HEW funded institutions, — and will intrude into the doctor- 
patient relationship far beyond current regulation. — 

The question to be addressed is what safeguards, if any, beyond those 
applying to ordinary medical practice are needed when a physician, through 
application of an unaccepted or untested procedure, attempts to confer a 
therapeutic benefit on a patient. Is every intentional departure from 
accepted practice to be considered research and subject to controls for 
research? Or can some instances of innovative therapy be 
distinguished from research and be treated separately? The answer 
lies in an examination of the risks created by boundary activities, 
the efficacy of current controls, and the incremental costs and

benefits of additional controls, such as those applied to federally 




funded research. To illuminate these issues this paper first analyzes 
the legal implications of characterizing a medical activity as research 
or therapy and then considers the policy alternatives that follow from 
these implications.

I. LEGAL CONSEQUENCES OF CHARACTERIZING MEDICAL ACTIVITY AS RESEARCH 
OR PRACTICE. 
While characterization as research or practice may ultimately have 
policy significance, at the present time it is reasonably clear that the 
labelling of a medical activity as research or practice has no major legal 
consequences in terms of who may engage in the activity, the circumstances 
under which a negligence award will be made, or the amount of information 
that must be disclosed to the subject of the activity. In the context of 
therapeutic activity that includes elements of research or innovation, no
question of who may perform therapy or research arises, for we can assume 
that activities of physicians and other appropriately licensed health 
professionals are involved. Moreover, there are no specific criminal 
prohibitions on doing research which legally distinguish research from 
therapy. The major points of difference, if any, concern liability and 
disclosure rules. 

A. Tort Liability

Aside from licensing and medical practice acts that restrict the 
persons who may practice medicine, and the general provisions of the 
criminal law, the primary legal constraint on physician activity arises






from the after-the-fact review and damage awards of the tort system. 
While conceivably different standards for ascertaining liability and 
imposing damages could apply, there appears to be no major difference 
between therapy and research in the standard for finding liability. 

1. Liability for Accepted or Routine Practice

A physician will be liable for damages if he fails to possess
a reasonable degree of skill and to exercise this skill with ordinary
care and diligence. What is reasonable and prudent care is usually
determined by the practice of other physicians in the same or similar
circumstances, 5/ though on occasion the courts have required a standard
of care higher than that of professional practice. 6/ In general, then,
a physician will incur no liability for use of a procedure, test or
technique if he uses it in a nonnegligent way (that is, as carefully as
other physicians in those circumstances), and it is considered by at
least a respectable minority of physicians to be an accepted therapy in
the patient's situation.

2. Liability for Experimentation and Innovation 

While the earliest American cases involving medical experimentation
or innovation seem to indicate that a physician will be strictly liable
for any deviation from standard or accepted practice, even if done for the
purpose of developing a better therapy, 7/ there is now considerable
support for the proposition that liability for innovation depends on the
reasonableness of the use of an innovative procedure in the circumstances
of the patient. 8/ The reasonableness of deviation from the accepted or
routine therapy will depend on the predicted condition of the patient, the
probability of success of customary therapies, the probability of success





of the innovative procedure, and the probability, type, and severity of
risks collateral to the therapy. The innovative departure will be
reasonable if it reasonably appears that the chances of providing a
benefit to the patient beyond that of customary therapy outweigh the
likely risks of the innovation. As with a standard therapy, the question
of liability depends on reasonableness of use:

It does not follow from the fact that a method 
of treatment is innovative that it is not reason- 
able medical practice to use it. Expert testimony 
on this issue can evaluate the defendant physician's 
innovative therapy on the basis of the condition 
of the patient, the probability of success of the 
therapy, and the nature, severity, and the probability 
of collateral risks. Such expert testimony would 
be responsive to the fundamental and long-familiar 
inquiry: Did the defendant doctor conform to the
standard of care of a reasonable practitioner under
the circumstances confronting him? 9/

Although the liability rule is identical for activities characterized 
as accepted or innovative therapy, the factual inquiry occurring in each 
case will differ. In an action for damages arising from use of an accepted 
therapy, the factual inquiry will usually concern establishing standard 
practice, and proving that the physician in fact deviated from it without 
justification, that is, administered or performed the therapy in a 
negligent manner. With an innovative therapy, the factual inquiry will 
also concern establishing the accepted therapy, but then focus on the 
justification for departure from it: what was known of the innovative 
procedure, the likelihood of risks, and the grounds for thinking that it 
would bring the patient a net benefit beyond that available with the
accepted therapy. In this inquiry particular attention is likely to be 






paid to the physician's consideration or use of customary therapies, the 
amount and type of prior investigation with regard to the innovative 
procedure, the results of animal research, if any, the conclusions that 
one can draw from general scientific principles, what the physician 
knew or should have known of those risks, and, in short, whether a 
reasonable practitioner, in the circumstances as established, would have 
been willing to undergo those risks to obtain the expected benefits. 
Thus, in the ordinary malpractice case the question of reasonableness 
usually will depend on whether the physician conformed to or deviated 
from the accepted standard of care. With innovative therapy, the 
question of departure is conceded and the question of reasonableness 
concerns whether the departure is justified given the patient's prospects
without it and the likelihood of a net benefit with it. 

A possible legal consequence could turn on the characterization of 
a boundary procedure as research or therapy, if research activities 
generally occurred only with the prior approval of an Institutional Review 
Board (IRB), as is now the practice for HEW funded research and, in many 
instances, for all research occurring in institutions receiving HEW 
funds. 10/ Two possible legal consequences could turn on this practice:
(1) immunity from liability if the IRB approves the activity and legally 
effective consent is obtained; (2) imposition of liability where IRB 
approval is not obtained. 

With regard to the first question, IRB approval alone would not 
provide immunity in a suit based on negligence in undertaking the
innovative procedure, even if the procedure were nonnegligently performed,
and legally effective consent were obtained. The claim here would 
be that it was tortious to undertake the procedure at all, even with full 
consent, and its legal resolution would depend upon the reasonableness 
of the experimental procedure, that is, whether the likely benefits to
the patient outweighed the risks. Although relevant and possibly
persuasive, IRB approval alone would not determine the reasonableness 
of the activity. The IRB could have acted negligently or misjudged the 
risk-benefit ratio, and in any event, has no legal power to foreclose a 
court from independently determining reasonableness. In fact, the IRB's 
standard of reasonableness (do the sum of benefits to the subject and 
increase in knowledge outweigh the risks to the subject), which takes
account of benefits to others, may well diverge from the standard applied
by the courts. A persuasive argument, based on the law's concern with 
personal integrity, can be made that the courts should and would exclude 
nonsubject benefits in this calculus, and would view the risk-benefit 
ratio solely from the subject's perspective. Thus, while prior IRB review 
may be helpful in screening out "unreasonable" research, it is no guarantee 
that liability will not attach to procedures that it approves. 

Conversely, failure to obtain or the denial of IRB approval may be 
relevant and even persuasive evidence on the question of the unreasonable- 
ness of undertaking a research activity that occurred with legally valid 
consent, but again it is not determinative. The reasonableness of the 
procedure depends on the risks and benefits to the subject. Analytically, 
IRB review does not alter the risk-benefit ratio of the proposed procedure. 11/






If the physician could establish that an activity characterized as 
research were reasonable in the circumstances, lack of IRB approval 
alone should not lead to liability. 

An exception to this conclusion could occur if IRB review were
mandated by statute. 12/ In that situation a court could find that
violation of the statute was negligent per se, because the statute was
designed to protect the class of persons in which the plaintiff is
included, against the risk of the type of harm which has in fact occurred
as a result of its violation. 13/ However, there would still remain
open such questions as the causal relation between the violation and the
harm to the plaintiff, 14/ and possibly such defenses as assumption of
the risk. 15/ The plaintiff would still have to establish that IRB
review in this instance would have prevented the activity, either because
it would have found the risk-benefit ratio unfavorable or would have
required a fuller disclosure than would have occurred, which in turn would
have led to nonparticipation by the subject. If the risk-benefit ratio were in
fact reasonable and legally valid consent obtained, it would be difficult 
to show that IRB review would have prevented the activity. If the risk- 
benefit ratio were unreasonable, or the consent was invalid, liability 
would exist independent of IRB review. Even if it did not, the plaintiff 
would still have to show that IRB review would have prevented the injury, 
possibly a difficult task with the current lack of empirical data on IRB
effectiveness in preventing harmful research or actually improving consent
procedures. 16/






B. Consent and Disclosure Requirements 

In addition to rules imposing damages for untoward results where a
physician unreasonably deviates from the standard of care, another major
legal constraint on medical activities is the set of rules requiring
physicians to disclose certain information about a proposed procedure for
a patient's consent to be deemed effective. Technically part of tort
liability, consent is sufficiently important to warrant separate
consideration.
However, analysis again reveals that with one possible exception disclos- 
ure rules do not vary with the characterization of a boundary activity 
as therapy or research. 

1. Disclosure in Accepted or Routine Practice



Generally, a physician may not treat a patient without consent. 17/
In determining the effectiveness of a patient's consent, the question
arises of how much information concerning the proposed procedure must be
disclosed in order for the patient's consent to be valid. Traditionally,
the rule has depended on the customary disclosure practice of the
profession for the given situation. Generally, the plaintiff has the:

burden to prove by expert medical evidence what
a reasonable medical practitioner of the same
school and same or similar community under the
same or similar circumstances would have dis-
closed to his patient about the risks incident
to a proposed diagnosis or treatment, that the
physician departed from that standard, causation,
and damages. 18/

Recently, with Canterbury v. Spence 19/ and a subsequent line of
cases, 20/ a minority of jurisdictions have begun to apply a new
disclosure rule, based not on professional practice, but on the amount of
information which a reasonable person in the patient's circumstances
would want to know in deciding to undergo the treatment: 21/

(T)he standard ... is conduct which is 
reasonable in the circumstances . . . the 
test for determining whether a particular 
peril must be divulged is its materiality 
to the patient's decision: all risks
potentially affecting the decision must 
be unmasked. The topics importantly de- 
manding a communication of information are 
the inherent and potential hazards of the 
proposed treatment, the alternatives to 
that treatment, if any, and the results 
likely if the patient remains untreated. 
The factors contributing significance to 
the dangerousness of a medical technique 
are, of course, the incidence of injury 
and the degree of harm threatened. 

In sum, liability for nondisclosure of the risks and other material 
details of accepted or routine care will depend on the jurisdiction in
which the nondisclosure occurs. In either case the plaintiff has the 
burden of establishing the information required to be disclosed under 
either the professional practice or reasonable person standard, that such 
information was not disclosed, and that had disclosure occurred, the 
plaintiff would not have undergone the therapy. 

2. Disclosure in Research and Experimentation

While there are few precedents concerning disclosure requirements
for research or experimental procedures as such, and cases in two
jurisdictions suggest that the experimental or innovative nature of a
procedure should always be disclosed, 22/ it appears that the disclosure
rule for accepted therapy would also apply to innovative or experimental
procedures. Thus in a jurisdiction requiring conformity to professional
custom, the experimental or innovative nature of the procedure, its
specific risks and
benefits, and the risks and benefits of alternative procedures would be 
disclosed only if the custom or practice of physicians in that situation 
was to disclose such information. A precise answer to the question of
what must be disclosed would thus depend on an empirical inquiry with
regard to each use of innovative therapy, and whether a local, similar
community, or national custom of practice were applied. Presumably, at
least in some instances, medical practice could include disclosure as
full as or even greater than that under the Canterbury reasonable person
disclosure standard, but this would vary with the procedure and the particular
circumstances of its use. 

In a Canterbury-type jurisdiction the fact that a procedure is
innovative or experimental, its risks and benefits, and the risks and benefits of
alternative procedures, would be disclosed only if a jury or court in its 
after-the-fact review concluded that such information would be material 
to the decision of a reasonable person in the patient's circumstances 
whether to undergo the procedure. Arguably such data would be disclosed
under this standard, though the courts have not yet directly confronted
whether the innovative nature of a procedure must also be disclosed. 23/

Elements of consent required by HEW 24/ for research which it directly
funds would probably have little impact on disclosure requirements in a
Canterbury-type jurisdiction, since those elements would appear material
to a patient's decision to consent and hence legally required. However,
they could be persuasive evidence of professional practice regarding
disclosure in jurisdictions requiring disclosure in conformity with






professional custom. Despite some ambiguity, the HEW regulations appear
to require the IRB to assure that consent will not only be legally
effective, 25/ but also will be "informed," which is defined 26/ to
include disclosures that would clearly go beyond professional custom
disclosures, specifically the fact that the procedure is experimental, as
well as disclosure of discomforts, risks and benefits of the procedure,
and alternatives. 27/ Assuming in a professional custom jurisdiction

that the HEW consent rules were generally followed by the profession
for all research, a court could find that the HEW disclosure requirements
defined the professional custom and hence the disclosure rule for
experimentation. If that were
the case, then in a professional custom jurisdiction characterization of 
a procedure as experimental could have legal significance with regard to 
liability for nondisclosure. But such a conclusion would depend on 
showing that the procedure in question was in fact experimental, and 
that a custom of submitting all experimental procedures, whatever their 
funding source, to IRB review, existed. If, as is more likely, the
custom could be established only for research directly funded by HEW or 
occurring in HEW funded institutions, then the HEW disclosure standard 
would not apply to all innovative procedures occurring in that jurisdiction.
Thus, while a more stringent disclosure requirement for research might 
exist in a professional custom jurisdiction, this standard would most likely 
apply only to research in HEW funded institutions, and only then if a
court accepted this argument. 






II. The Need for Special Protection in Boundary Activities

Since experimentation is not a legal category with separate 
liability and disclosure rules, there are presently no significant legal 
consequences that hinge on a boundary activity being characterized as 
research or therapy except for a possibly more stringent disclosure re- 
quirement in certain circumstances. Moreover, even if research or experi- 
mentation had legal significance as such, legal consequences beyond those 
applicable to ordinary therapy would attach to boundary activities only if 
they were always regarded as research. As discussed below, there are 
sound reasons for not treating every application of innovative therapy as 
research. 

The question remains whether legal significance should attach to 
boundary activities, no matter how they are characterized. This could 
result from creating special rules for experimentation, and treating some 
or all boundary activities as research. Or rules more stringent than 
for accepted therapies and less restrictive than for research could be
legislatively or administratively devised to regulate boundary activities. 
Alternatives here include criminal prohibitions, disclosure and liability 
rules, and prior or after the fact review. Before considering such 
alternatives, however, it is necessary to consider whether boundary activi- 
ties (1) create risks to patients beyond those of ordinary medical practice, 
and, if so, (2) whether existing legal and peer review mechanisms 
provide patients sufficient protection. If risks to patients greater 
than the risks of accepted practice exist, and they are not sufficiently 






controlled by existing mechanisms, then consideration should be given to 
alternative techniques for controlling them. 

A. The Problem of Innovative Therapy: The Risks 

An important issue is whether boundary activities, which share 
features of ordinary practice and innovation, create risks to the patient 
beyond those that exist in the application of accepted, routine therapies. 
If so, are those risks so similar to the risks of physician conflict and loyalty to 
future patients existing in pure research that they require similar treat- 
ment? Boundary activity or innovative therapy may create additional risks 
in at least three ways. 

First, simply because a procedure is new or sufficient experience is 
lacking, a patient may be subjected to a risk greater than occurs with 
standard therapies. In the case of the latter, the risks to the patient 
are that through ignorance, intent, or negligence a procedure will be 
unnecessarily applied; that it will be applied in a negligent manner;
that it will be ineffective; or that it will cause anomalous injuries or 
results. Generally, however, a therapy is standard or accepted because its 
risks are known and there is some basis for thinking that on balance its 
application will benefit the patient. A boundary activity, on the other 
hand, subjects a patient to these risks and more. For while with an 
accepted therapy the patient has some reasonable expectation of benefit, with 
innovative therapy the risk is greater that the therapy will not work or 
that it will have harmful effects of its own, if only because its effects are 
unknown. These risks are greatest with the first use of an innovative 






therapy, but continue to be substantial until sufficient data on its effects 
exists. There is also a greater risk that the therapy will be applied 
negligently or without adequate skill, because due to its newness, 
physicians have not become skillful in applying it. 28/ There is also a
greater chance of anomalous results occurring if only because it will not 
yet be known which patients are subject to anomalies. These risks 
include both the loss of an alternative, accepted therapy (though inadequate), 
and injuries directly caused by application of the new therapy. 

While some added risk appears likely because of less experience with 
an innovative therapy, it is a question for empirical research how signifi- 
cant this additional risk is. Many accepted therapies have never been 
validated as effective, and to some extent, may impose risks similar to 
those of innovative therapy. On the whole, however, it seems reasonable to 
differentiate accepted and innovative or boundary activities by what is
known about their likely risks and benefits. Serious deficiencies
in our knowledge of the effectiveness of standard therapies do not
change the fact that in using a therapy that is relatively unknown, the
risks of injury or ineffectiveness are apt to be greater.

A second type of risk in boundary activities is that the physician's 
decision to undertake the procedure and his disclosure to the patient may be 
influenced by scientific, career and future patient factors rather than by 
the interests of the patient alone. The danger is that these factors
will lead him to undertake a procedure that imposes an undue risk (an
unfavorable risk-benefit ratio) on the patient, and perhaps to influence
or manipulate the patient's consent.
With accepted therapies, as debates over prepaid delivery systems and 






utilization review show, factors such as profit, efficiency, or 
specialty orientation may also conflict with the patient's interest. 
While such decisions are deemed unethical and are decried by the medical 
profession, they may be inherent in the practice of any profession, and 
hence are left to professional discipline or tort remedies. 

With a boundary activity, which involves a departure from standard 
practice out of a sense that a better procedure exists, there exists, in 
addition to the conflicts inherent in any professional practice, the 
possibility that the physician's activities will be motivated or influenced 
in part by scientific or career aspirations, or by the desire to develop 
a technique that will benefit future patients. That is, the physician's 
decision, and his communication with the patient concerning it, will be 
influenced to some extent by personal or career considerations that go 
beyond the immediate interests of the patient, thus leading to a decision to 
employ an innovative therapy that would not have occurred if the patient's 
interests alone were considered. The recognition that the 
investigator's loyalties to the subject-patient were under great pressure 
from loyalties to future patients and career goals has led, in the case of 
experimentation, to the development of review and consent procedures to 
assure that patients' interests do not suffer, presumably because existing
control mechanisms were inadequate to protect patients. 

With boundary activities, the question thus is whether, and under what 
circumstances, patient or other interests are likely to predominate. It
may be that with many boundary activities the return to the doctor in terms 
of career and future patient goals is no different than in the application of 






an accepted therapy, or that if some nonpatient concerns are present, they
are neither so strong nor dominant as they are in formal research. In
other instances, such as the case of Florentino v. Wagner, 30/ where a
surgeon's decision to use an innovative spinal operation led to serious
injury to several patients, the decision to use the innovative therapy
and the information disclosed to the patient may be strongly influenced
by the desire to develop a procedure at the expense of the patient.
Since boundary or innovative activities may involve both poles of
patient concerns, an important question is (1) ascertaining the
frequency and (2) identifying the circumstances in which patient
interests are likely to be secondary.

In addition to increasing risks to the patient, a third potential
problem with boundary activities is that they generally do not occur in
a manner likely to maximize the reliability of data deriving from
their use. Since a boundary activity involves a therapeutic use of a
procedure whose efficacy or risks are still so unclear that it has
not yet become accepted therapy, it is important from the perspective
of future patients and medical science generally that reliable information
be obtained about the activity's benefits, risks and efficacy. Without
such data the future patient who receives or does not receive a particular
innovative therapy is at greater risk than if the earlier uses of the
therapy had occurred under circumstances and in a manner that would have
maximized the chance of obtaining reliable data. It is unlikely, however,
that most boundary activities maximize the chance of deriving reliable data.
By definition, as it were, the physician will not conceive of his activity






as being experimental, and hence will not apply it in a methodologically
sound way, 30a/ for in most cases he thinks he is doing therapy. Even if
nonpatient considerations are strong in the decision to use a therapy,
at best the result will be a one-patient experiment, whose outcome cannot
always be meaningfully extended, even if it is disseminated, to other
cases. There is also the danger that innovative therapies which appear
effective when used in an uncontrolled setting will become accepted when
they are actually harmful or ineffective. Once accepted, it is difficult
to conduct the controlled trials to test their efficacy which may be
desirable, and even necessary, to protect patient interests. The recent
history of medicine contains several examples of innovative therapies
being widely adopted for a period as standard because early uses did not
occur in the context of methodologically sound clinical trials which could
have yielded reliable data regarding use of the therapy with future
patients. 31/

B. Adequacy of Present Controls 

While it seems reasonably clear that use of innovative therapy 
creates risks to the patient beyond those that exist in the ordinary 
therapeutic situation, risks which may be similar in kind to those that 
exist in research, and also creates the risk that maximum possible know- 
ledge will not be forthcoming from each instance of use, it does not 
follow that new controls must be devised for boundary activities. 
Rather, the adequacy of existing control mechanisms in minimizing these 
risks must be examined. Two types of controls that impinge on the use of 
innovative therapy will be examined to determine whether it is likely that 






either or both provide physicians with sufficient incentives to minimize 
the risks to patients. 

1. Tort Liability

The possibility of tort liability impinges on the use of
innovative therapy in two respects. First, a patient injured from the
use of an innovative therapy can seek money damages in a civil suit
claiming negligence or malpractice in the decision to use the innovative
therapy. 32/ Since the physician will by definition have deviated from
accepted, standard professional practice, recovery will depend upon
whether reasonable, prudent care in the circumstances would encompass use
of the innovative therapy. If the risks to the patient from the therapy
itself and the foregone alternative are greater than the likely benefits,
then the physician will be liable whether or not the patient consented to
undergo those risks. 33/ On the other hand, the physician will not be
liable if he can show that it was reasonable to think that the benefits of
the innovative procedure outweighed the risks, including the loss of
benefits from foregone alternatives.

Second, a physician could be liable for the use of an innovative therapy if he failed to disclose information required for legally effective consent. Depending on the jurisdiction, recovery here will depend on the amount of information disclosed. In the majority of jurisdictions the physician will be required to disclose only that information which physicians in that situation customarily disclose. Since it is unlikely that there will be a practice established concerning disclosure for the specific therapy involved, the question will be what physicians disclose about innovative therapies in general, or about innovative therapy for this type of disease. 34/ A strong minority of jurisdictions, however, require that the physician disclose all information material to the decision of a person in the patient's position whether or not to undergo the procedure. Ordinarily the risks and benefits of the proposed procedure, the risks and benefits of alternative procedures, and, probably, the innovative or experimental nature of the procedure would have to be disclosed.

The question, thus, is whether the possibility of tort liability for unjustifiable uses of innovative therapy or for failure to disclose relevant facts will induce doctors to use innovative therapy only when it will reasonably provide a net benefit to the patient, and the patient consents. The ability of the tort system to achieve these goals must be questioned. The tort system is not calibrated to deal with every deviation from ethical conduct. First, it operates only after an injury occurs. Use of innovative therapy may be highly unethical, as where the risk is much greater than any benefit to the patient, but unless the risk materializes, no tort remedy is available. Second, where the risk does materialize, a number of factors may operate to prevent a successful suit. The patient may be unaware of a wrong, the injury may not be worth the cost of litigation, he may be unwilling to sue, he may lack the resources, etc. Finally, if a suit is filed, the chances for recovery may be slim. Most malpractice cases are decided favorably to the doctor. 35/ The patient will have to show that he is worse off than he would have been if he had not undergone the innovative therapy, and this may be difficult. For all these reasons, the threat of a law suit and legal liability may not prevent physicians from using innovative therapy in situations that ignore patient interests, if use otherwise seems justified. Of course, one might argue that the physician will be all the more careful when using innovative therapy precisely because the chances of liability are greater, but empirical data to evaluate this claim are lacking.

Similarly, the law of informed consent will not necessarily assure that the patient will be informed to the same extent that ethical practice requires, or that would occur through some other control process. 36/ First, the standard for disclosure will be considerably less in those jurisdictions that allow medical custom to define the limits of disclosure, 37/ and even in Canterbury-type jurisdictions, it is not yet established that the innovative nature of a procedure must be disclosed. Second, whatever the standard for disclosure, implementing the standard legally will depend on the occurrence of injury, willingness and ability to sue, and establishing that if additional information had been disclosed, the patient would not have consented. Though not insurmountable, these are formidable barriers raising doubts about the efficacy of tort liability to assure ethical practice in disclosing relevant information about the use of an innovative procedure.

Two further aspects about tort liability should be noted. The first is that in at least one respect, the tort standard of reasonableness based on a calculus of risks and benefits to the patient may be more favorable to the patient than the HEW standard employed by IRB's, because benefits to future patients will probably not count, as the HEW standard allows. Second, the limitations of the tort system arise from the way the system is presently constituted. Changes in tort liability rules that permit awards of damages on the showing of injury alone, or that otherwise facilitate suit, may well make the liability system an effective device for controlling the possible abuses of innovative therapy.

2. Peer Review 

In addition to the incentives provided by the legal system to 
give primary weight in boundary activities to patient interests (that is, 
to judge the risk-benefit ratio in terms favorable to the patient), a 
variety of professional norms and review mechanisms also provide such 
incentives. In discussing them, the question to be kept in mind is to 
what extent they are likely to counter tendencies in the innovative 
therapy situation to disregard patient interests and thus assure an 
acceptable quality of care in these activities. 

a. Professional Ethics and Codes 

Professional ethics, as exemplified in codes and medical ethical writings, generally require loyalty to the patient, and do not sanction compromise of patient interests for personal or career goals, or even simply to advance science. 38/ Since such codes and norms are generally hortatory, carry no specific sanctions, and may often not be clearly applicable to boundary activities, one might justifiably display skepticism as to their efficacy in assuring protection of patients in innovative therapy situations. No doubt many physicians have internalized and comply with these ethical precepts, but at present there is not substantial evidence showing that adherence to a code of medical ethics alone will prevent patient abuses in innovative therapy or improve the methodologies with which they are used.

b. Informal Peer Review Mechanisms 

Other mechanisms that might provide incentives to apply innovative therapy in ways protective of patients are the various informal and formal professional review mechanisms. Medical audits, utilization review, tissue committees, credential committees, academic rounds, and the like, all review physician decisions to some extent and presumably have various sanctions to induce compliance. To the extent that colleagues and review committees reviewed boundary activities and evaluated their ethical justification, a physician might be induced to make decisions with appropriate risk-benefit ratios and consent procedures, for fear of peer disapproval, censure, nonreferrals, or perhaps more stringent sanctions such as limitation or termination of hospital staff privileges.

Without data available on the precise scope and details of these review mechanisms, it is difficult to evaluate their efficacy in minimizing the abuses of innovative therapy. However, a number of factors cast doubt on their efficacy. First, there is no guarantee that most boundary activities will come to the attention of peer review mechanisms. The frequency of review will vary with the setting and type of activity, and no doubt may occur more often with surgical procedures 39/ or practice in an academic setting. 40/ Secondly, even if particular boundary activities are reviewed, one cannot be sure that the criteria and standards applied will coincide with the socially desired criteria. Professional standards as to when risks and benefits of an innovative therapy are appropriate might unduly weigh scientific and future patient interests over those of the patient. Also, medical audit and review programs do not generally look at the consent process. 41/ Finally, peer review mechanisms do not always carry the sanctions that would induce more desirable behavior, though the potential for so doing could be there.

One situation in which peer review mechanisms may be effective is that of the internally or externally imposed clinical moratorium on further uses of an innovative procedure, when great risks to patients become apparent. 42/ While the moratorium phenomenon has operated effectively with innovative cardiac surgery, it appears subject to the same deficiencies as other peer review mechanisms. 43/ In sum, various peer review mechanisms, if they exist at all, do not appear geared to review innovative therapy in a manner necessarily coincident with what is most socially desired. For this reason, they do not appear to provide sufficient incentives to assure protection of patient interests in innovative therapy situations.

c. PSRO 

A brief word about the relevance of PSRO's is in order, since once they are functioning, PSRO's will be the most comprehensive peer review mechanism in operation. 44/ Because of their nationwide scope and review of both institutional and outpatient care, they are likely to pick up more instances of innovative therapy than any other review mechanism. PSRO's also have the power of the purse to enforce their standards, since they may deny payment for inappropriate or unnecessary services. However, because PSRO review is limited to Medicare and Medicaid patients, most doctor-patient encounters will not be within their ambit. A key question concerns whether PSRO standards will exclude payment for boundary activities that appear unjustified from the patient's perspective. Since norms will be set by physicians, this will depend on whether norms reflect patient or professional interests. Secondly, whatever the norms, their efficacy will depend on their implementation: on the willingness of PSRO's to take a firm stand against dubious professional practices. One can expect the more outrageous conduct to be penalized, but many cases of innovative therapy may not fall into that category. 45/ Moreover, it is not clear that PSRO's will identify and prevent abuses of innovative therapy that slip by the tort system and other review mechanisms. In sum, while some peer review procedures, particularly PSRO's, may help define standards of acceptable practice, their efficacy in preventing or deterring unacceptable instances of innovative therapy is unclear. Data are lacking on the extent to which they provide incentives beyond that of the tort system to honor patient interests in applying new therapies.






III. Alternatives for Control 

If one concludes that the risk to patients from boundary activities is significantly greater than with accepted practice and that tort and peer review mechanisms provide insufficient incentives to protect patient interests, then several alternatives for minimizing patient injuries from innovative therapy may be considered. Each alternative, however, has costs, ranging from the costs of administering a review system to the costs borne by patients when an innovation beneficial to them is not available. With each alternative the inquiry is the same: do the benefits in patient protection, personal autonomy, and increased knowledge outweigh the costs?

Before analyzing specific suggestions for improving tort and peer review mechanisms, it is necessary to consider whether boundary activities should be thought of as experimentation. Whether a special set of rules or controls is to be applied to innovative therapy depends first of all on whether a special set of rules is to apply to clear cases of experimentation. Aside from activities specifically funded by HEW, there are at present no legal controls on experimentation or innovative therapy other than general principles of tort law, which appear to treat experimentation and therapy identically. Unless legal controls on experimentation are developed, it would seem a fortiori that no controls should be forthcoming for innovative therapy, since the risks it poses seem much less than those of experimentation. However, assuming that controls for experimentation are developed, either legislatively or administratively, a question remains whether (1) they should also apply to innovative therapy; (2) a special set of rules for innovative therapy should be developed; or (3) innovative therapy should be treated like accepted practice. If the same rules are to apply to experimentation and innovative therapy, no problem of definition arises, for experimentation can be broadly defined to include at least all intentional deviations from customary practice. 46/ If situation (2) or (3) applies, that is, innovative therapy is to be treated differently from experimentation, either with or without a special set of rules, then criteria for distinguishing innovative therapy from experimentation must be developed. 47/

A. Should Innovative Therapy Be Treated as Experimentation 
Assuming that through legislative or administrative action experimentation will become a distinct legal category with specific liability, disclosure or review requirements, the question is whether experimentation should be defined to include all intentional deviations from standard practice, including innovative therapy, as many current definitions and experts suggest. 48/ Since an innovative therapy is usually insufficiently proven or tested to be established as effective, calling it experimental seems appropriate. Moreover, the incentives that exist in the clearly experimental situation to disregard the patient's good in order to advance the interests of the researcher or third parties may also exist in a boundary activity, though they are not as likely to be present or, if present, to be as strong. Defining or treating innovative therapy as experimentation, with prior review by an IRB, 49/ will thus lead to risk-benefit calculations more favorable to patients, will lead to more fully informed patients, and possibly will improve the reliability of data generated by innovative therapy by "experimentalizing" its use. 50/

Requiring prior review by an IRB for all uses of innovative therapy, as well as for experimentation, however, may pose significant problems. Assuming the requirement is legislatively imposed, then IRB's will have to be constituted in numerous institutions and settings where they do not now exist. For while research may occur in limited settings, innovative therapy is likely to occur wherever medicine is practiced. In addition, it is not clear that all uses of innovative therapy in an office practice can be brought under an IRB umbrella. Such a requirement would constitute a governmental intrusion into medical practice far greater than has yet existed. It is highly likely that the medical profession would resist enactment of such legislation or would challenge it in court if passed. 51/ Indeed, it is not at all clear that the dangers of innovative therapy are so great that the incremental benefit from IRB review would constitute the compelling state interest justification necessary if such legislation is to be constitutional under Doe v. Bolton. 52/

If IRB review is required only for innovative therapy in institutions receiving HEW funds, problems still exist. First, if the requirement is the receipt of any HEW funds, then most hospitals and many physicians would qualify, if they receive Hill-Burton, Medicare, or Medicaid funds. Secondly, existing IRB's would be hard-pressed to review every instance of innovative therapy given their present resources and workload. A permanently constituted review process, with staff, etc., would be essential. This expense would probably be passed on to consumers, thus increasing the cost of health care. Third, some degree of intrusion into the doctor-patient relationship will occur, with additional constitutional difficulties as to Congress's power to condition federal grants on regulation of nonfunded activities. 53/ IRB review could be limited to innovative therapy directly funded by the government, but then only a small percentage of boundary activities would be regulated. 53A/

In addition to problems of constitutionality, scope, administrative cost and implementation, two further factors cast doubt on the wisdom of requiring IRB approval of all innovative therapy, as many institutions now purport to do in the general assurances given DHEW. 53B/ One is that despite similarities to experimentation, innovative therapy may be primarily therapeutic and for the benefit of the patient, and only secondarily may involve the concern for science and future patients that creates the researcher's conflict of interest in experimentation. Such incentives may occasionally operate, but on the whole, they appear to be considerably diminished in strength and alone may not justify the tremendous costs and burdens of a prior review system, particularly when existing liability and disclosure rules will prevent the most egregious abuses. Secondly, these doubts are all the more compelling when we consider that IRB review will not necessarily assure more complete disclosure or better risk-benefit ratios for patients. No data establishing IRB efficacy in either regard now exist. In fact, available data suggest that they may have little effect, particularly on improving the consent process. 54/ Moreover, IRB balancing of total benefit against patient risk could put the patient's interests secondary to scientific advancement, though this may be only a theoretical concern. While IRB's in some places may be effective monitoring and protective devices, or may become so with certain changes, given existing data and the institutional context in which IRB's operate, one should hesitate to multiply them and expand their scope at great cost unless there is a reasonable chance that they will achieve the goals desired.

This position differs from Robert Levine's statement that "in general innovative therapy should be conducted and reviewed as if it were research." 55/ He further states:

    For practical purposes, the definition of
    research as provided in this paper, includes
    innovative therapy (or innovative practice).
    This means that any innovative practice in which
    the deviation from customary practice is substan-
    tive should be conducted so that it most closely
    approximates the standards of good research (as
    defined by the relevant scientific discipline)
    without obstructing the intent to bring direct
    health benefit to the patient-subject. It further
    means that the proposed innovative activity
    should be reviewed by an IRB, that the consent
    negotiation indicate that the activity is being
    performed with — at least in part — research
    intent, and so on. 56/

While recognizing that emergency 57/ and nonsubstantive deviations 58/ from customary practice might not warrant treatment as research, Levine's position rests on a particular definition of research and on the need to maximize knowledge from a particular use of an innovative therapy. This position seems erroneous in three respects.

First, defining research as including all substantive deviations from customary practice seems overinclusive. As discussed more fully below, neither deviation from customary practice nor intent to obtain new knowledge adequately distinguishes research from primarily therapeutic activities. Rather, the distinguishing feature should be a primary intent to obtain new knowledge beyond the needs of the patient. When applied to innovative therapy, this criterion will distinguish emergency and "nonsubstantive" uses of innovative therapy, as well as substantive uses of innovative therapy which are primarily therapeutic in intent and only secondarily involve obtaining knowledge beyond the patient's needs. Use of untested therapies is certainly of concern, and may require special safeguards. But when their use is not influenced by interests contrary to the patient's needs, there is no need to treat innovative therapy as research.

Second, Levine may place undue emphasis on the need for studying all innovative practices systematically during the process of innovation. 59/ The goal is certainly a worthy one and should be encouraged. However, one should not be overly optimistic that IRB review will lead to better controlled uses of innovative therapy, without more evidence that IRB's are capable of turning single uses of innovative therapy into controlled clinical trials. 60/ Also, this concern places the interests that future patients have in safe, efficacious therapies above the immediate interest of the patient and doctor in applying an innovative therapy. There may well be situations in which use of an innovative therapy is delayed or even denied, to the detriment of a patient, because the physician cannot readily experimentalize its use in order to maximize knowledge from its application. Although better testing of innovative procedures is desirable, achieving that goal should be separated from the different goal of protecting patients from the conflicting interests of research situations.

Third, Levine overlooks the legal and administrative problems that would arise if all substantive innovative therapy had to obtain prior approval of an IRB. If review is required for activities other than those directly funded by DHEW, political, legal and constitutional problems arise, not to mention the cost and administrative difficulties in setting up new IRB's or overloading existing IRB's with substantially more business. Administrative difficulties alone should not prevent protection of human subjects. But these costs should not be incurred unless there is a reasonable certainty that they will actually produce greater benefits for patients.

B. Should Innovative Therapy Be Treated Differently From 
Accepted Therapy 

If there are good reasons for hesitancy in treating all innovative therapy identically with experimentation, particularly in the respect of prior IRB review, the question remains whether there should be any special controls for boundary activities (though short of the controls for research), or whether innovative therapy should be handled like accepted therapies. In either case, however, it will be necessary to define a boundary between research and innovative therapy, no matter how innovative therapy may be regulated. This section first discusses distinguishing innovative therapy from experimentation by the physician's intent, and then discusses the costs and benefits of various alternatives for dealing with innovative therapy.

1. Distinguishing Innovative Therapy From Experimentation

The criteria proposed to distinguish those activities that are to be regarded as research and subjected to a special set of controls generally include three elements: (1) untested or unproven efficacy; (2) a deviation from standard or customary practice; and/or (3) an intent or aim to develop new knowledge. For example, the DHEW regulations, through a definition of "subject at risk," stress deviation from standard practice: 61/

    activity which departs from the application of those
    established and accepted methods necessary to
    meet his needs.

Robert Levine defines research both in terms of intent and deviation: 62/

    any manipulation, observation, or other study
    of a human being — or of anything related to
    that human being that might subsequently
    result in manipulation of that human being —
    done with the intent of developing new know-
    ledge and which differs in any way from
    customary medical (or other professional)
    practice.






Martin Norton focuses on lack of proof of efficacy and intent: 63/

    Experiments can be described as: Those
    procedures that are untested or unproved
    with respect to clinical efficacy or
    are by their very nature not related
    to the therapy of the patient but
    rather performed solely for the purpose
    of obtaining scientific data.

These definitions, which are typical of current attempts to define research, suffer from under- or overinclusiveness. A definition is overinclusive if it is so broad that it encompasses clearly accepted medical procedures, as would occur if experimentation meant every use of an unvalidated or unproven procedure, as Norton and others suggest. Unvalidated practices may well pose risks for patients and deserve close scrutiny, but the fact that an accepted medical procedure used with therapeutic intent has not been reliably validated does not mean that it is experimental. While such a definition of experimental serves to call attention to the need for more thorough testing of ordinary therapies, it clashes with common usage and risks confusing the problems of insufficient testing with the quite different problems that arise when persons are used in biomedical experimentation.

A second criterion of the experimental, deviation from customary practice, also appears overinclusive. One may deviate from standard practice for many reasons: out of ignorance, negligence, disagreement with the standard, or in an attempt to find a better therapy. Since we do not regard every deviation from standard practice as an experiment, this criterion will not do. Indeed, if it were sufficient, it would also be underinclusive, for it would exclude experiments with an accepted therapy, though clearly one could conduct an experiment to compare the efficacy of two accepted therapies. 64/ Of course, most instances of research or experimentation are deviations from accepted therapy. However, this seems due to the aim, intent or purpose with which they are done and not simply because they are a deviation from a customary practice.

A third criterion focuses on the state of mind of the physician and asks whether there is an intent, aim, or purpose to develop data or knowledge. Even this criterion risks overinclusion unless qualified, for most tests and procedures in accepted therapy are done with the intent or aim of obtaining knowledge, such as knowledge about the patient, his body functions, the effect of a therapy, and the like. Furthermore, this knowledge is usually new, in that it was not previously known about the patient. Thus the intent necessary to define research cannot be simply the intent to obtain new knowledge, for that intent clearly characterizes therapeutic activities, as Moore, Norton, and others have recognized. 65/ Rather, the intent must be to test or gather knowledge about a condition, test, outcome, or procedure beyond the needs of the patient, even though the patient may also benefit from the effort. The utility of this definition is that it focuses attention on interests and aims other than the immediate interests of the patient, which is why there is concern with experimentation. Thus a deviation from standard therapy which benefits a patient would be research if it would not have been done absent an intent to gather data beyond the needs of the patient, and would not be research if it would have been done absent an intent or purpose to gather data about the procedure beyond the immediate needs of the patient. A deviation from standard practice done solely with the intent of benefitting the patient may amount to negligence or quackery if there is no reasonable chance of helping the patient.

This definition should serve to distinguish those activities for which special protections are needed because nonpatient interests are paramount. Though the intent criterion applies both to conformity to and deviations from accepted therapy, it also distinguishes those instances of deviation from customary practice which should be treated as experimentation because of the presence of interests that conflict with those of the patient. Intentional deviations from standard therapy are thus considered research if done primarily with an intent to develop new knowledge about the procedure or test beyond the needs of the patient, and therapy if done primarily with an intent to benefit the patient, where knowledge about the procedure itself is secondary.

Two problems with the intent criterion should be mentioned. One concerns a distinction between general and specific intent. In law one is often held to intend the natural and probable consequences of one's act, even though one specifically intended or aimed only to do the act producing those consequences. 67/ Since a particular therapeutic use of an innovative therapy may naturally yield knowledge concerning use with other patients, one might argue that a general intent to use the therapy should be treated as an intent to derive knowledge for other uses, merely because such knowledge is a likely or natural and probable consequence of its use. Usually a physician will know that such knowledge will result, so that the possibility of a nonpatient benefit might, albeit subconsciously, influence his decision to use the therapy, even though at the time of use he specifically intends only therapy and benefit to the patient. However, if an interest conflicting with the patient's operates only on the subconscious level, it does not differ from the physician's interests in extra income, time, etc., that may conflict with patient interests in situations of ordinary therapy and which arguably deserve no special protection. The strongest case for treating the general intent to use an innovative therapy as equivalent to a specific intent to acquire knowledge beyond the patient's interests would exist in the first use of a drug or new surgical procedure. Here the development of knowledge is inevitable, and here it is likely that the intent to gain new knowledge is strong, or at least equivalent to the therapeutic intent. 68/ Thus a standard of specific intent to produce new knowledge for use by others will identify most of the situations of innovative therapy that are of concern. Even if a special rule were justified for first uses of a new procedure, this would not change the fact that later uses may be specifically intended only to benefit the patient.

A second problem with an intent criterion is its implementation. If the presence of such intent transforms a therapeutic situation into research, and thus touches off a need for prior review or other procedures, then a review system will be overdependent on the good faith of physicians, when their loyalty to patients is itself the issue. For a boundary situation to be subject to special controls, the physician will have to determine what his primary or specific intent is. If he determines that his intent is research, then he must submit the procedure to review or whatever other mechanisms exist. Such a system, it may be argued, lends itself to abuse, because (1) physicians will have an incentive, in searching for their purpose, to emphasize its therapeutic aspects when research plays a dominant role; and (2) no sanctions can be applied for their failure to submit to a review process, even if the requisite intent is present, because it could never be established that they possessed a research rather than a therapeutic intent.

No doubt some physicians, as a result of this system, might be quick
to downplay or deny nontherapeutic intent in boundary situations. At a
certain level, however, every regulatory system is dependent on the good
faith of the regulated. Defining all innovative therapy as experimentation
would not, unless every physician decision were monitored, yield better
results, because it would still be dependent on a physician recognizing
or admitting that a procedure is, in fact, a deviation from standard
practice and, then, deciding to submit it to review. As with the intent
standard, the physician will have incentives to find that his procedure is
actually recognized or accepted by some segment of the profession, or, if
that is impossible, simply not to submit it to review. 69/ Absent
a monitoring system, there will not be any behavioral indication that the
procedure is innovative rather than accepted, as there might be with
clear-cut experimentation. 70/ While the intent standard may pose compliance
problems, those problems are not likely to be greater than would exist
with a deviation from customary practice standard, which, as we have seen,
may be underinclusive anyway. It does have the advantage of drawing a
fairly clear line, which each physician can personally feel (and, if in
doubt, can call research). Since any control system will have to rely






on physician compliance to an important extent, that fact alone should not
render the intent standard unworkable. 71/
2. Controls for Innovative Therapy 

If one agrees that all innovative therapy need not be treated like 
research, and that a boundary based on specific intent is a workable device 
to identify those instances of innovative therapy which involve research, 
the question remains whether therapeutic deviations from standard practice 
primarily to benefit the patient need any safeguards or controls in addition 
to those that apply to accepted therapy. 

a. Argument for No Additional Controls 

The argument for no additional controls would be that where the
physician intends to use an innovative therapy primarily to benefit the
patient, no special protection is needed because no nonpatient interests
beyond those that ordinarily exist in therapy are operative. Rather, the
risk is that through ignorance, misinformation, or negligence a physician
will miscalculate the risk-benefit ratio and impose unreasonable risk on
patients. However, this danger to some extent exists in any therapeutic
situation, and the physician will have the usual incentives to work for
the benefit of the patient. Moreover, he is likely to be especially wary
of a lawsuit where a risk of injury is greater because of uncertain
knowledge, and hence will be more careful about obtaining consent and
assuring that the patient stands to benefit. Particularly in jurisdictions
requiring disclosure of the innovative nature of a procedure, the legal
system already provides enough protection. Further controls would be an
unnecessary and unwarranted






intrusion into medical practice. 

The validity of this argument rests on whether one thinks that 
sufficient incentives to respect patient interests and autonomy already 
exist, or whether because of lack of knowledge or deficiencies in the 
legal system, physicians are apt to miscalculate risks and benefits to 
the detriment of the patient. 

b. Additional Controls 

If one thinks that on balance physicians may, even when acting
primarily for the patient's benefit, tend to miscalculate risks and benefits
to the patient's detriment more often than would occur with accepted
therapy, several alternatives to improve their calculation exist.

(1) New Liability and Disclosure Rules 

One alternative would be to change current liability and disclosure
rules, to assure that the physician accurately judges that potential
benefits outweigh the risks, and that full disclosure occurs. Again,
since it is unlikely that special liability and disclosure rules for
innovative therapy would be enacted independently of such rules for
experimentation, the question is whether enacting special liability and
disclosure rules for experimentation is warranted. With regard to
liability, physicians engaging in experimentation could be strictly
liable for any injury resulting from use of the experimental procedure,
whether or not negligence occurred. The effect of this rule would be
to internalize to the research project itself the costs of injuries now
borne by the subjects. 72/ If effective, it would force the researcher






(or institution) to calculate the chances of such injury and to determine 
whether this additional cost is outweighed by the benefits to be achieved 
by the research. Strict liability would thus be justified on the ground 
that the physician is in the best position to decide whether the likely 
benefits will outweigh the costs. 

Such a rule would be socially desirable if, in fact, physicians made
fairly accurate predictions as to all the costs and benefits of an experi-
ment, including benefits to future patients and the costs to subjects, and
if they were in a position to capture enough of the benefits to cover the
costs they will incur if liable. If they are bad predictors, or if
the benefits they capture do not outweigh their costs, even if total bene-
fits outweigh total costs, then socially desirable research will not take
place and future patients will unnecessarily suffer.
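The cost-internalization logic of this paragraph can be sketched as a toy expected-value calculation. All figures below are hypothetical assumptions added for illustration; they are not drawn from the paper:

```python
# Illustrative sketch (hypothetical numbers): under strict liability a
# researcher compares the benefit he can capture against expected
# injury payouts, as the text describes.

def expected_injury_cost(p_injury: float, avg_damages: float,
                         n_subjects: int) -> float:
    """Expected liability cost if every injury must be compensated."""
    return p_injury * avg_damages * n_subjects

def proceed(benefit_captured: float, p_injury: float,
            avg_damages: float, n_subjects: int) -> bool:
    """Research proceeds only if the capturable benefit covers the
    expected injury payouts."""
    return benefit_captured > expected_injury_cost(
        p_injury, avg_damages, n_subjects)

# A project whose capturable benefit exceeds expected payouts goes forward:
print(proceed(benefit_captured=500_000, p_injury=0.02,
              avg_damages=100_000, n_subjects=50))   # True

# But research is deterred when the researcher cannot capture enough of
# the benefit, even if the total social benefit is larger:
print(proceed(benefit_captured=80_000, p_injury=0.02,
              avg_damages=100_000, n_subjects=50))   # False
```

The second case is the paper's worry: socially desirable research forgone because the researcher bears all injury costs but captures only part of the benefit.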

A more precise analysis of a strict liability scheme for experimentation 
injuries, which is needed before such a rule can be recommended, is 
beyond the scope of this paper. The key question concerns whether such an 
approach will adequately compensate injured patients while not reducing 
research below a socially optimal level. Assuming such a rule existed 
for experimentation, the question is whether it should be extended to 
innovative therapy that is primarily therapeutic in intent. Again, 
the answer to this question will depend on whether such a rule will deter 
uses of innovative therapy that, on balancing risks and benefits, seem 
justified. Unless a nonfault or strict liability rule applies to all 
medical injuries, physicians may well avoid deviations from standard 
therapy aimed at benefitting the patient, because of fear of liability, 






even though on balance the patient will be better off. 

A similar inquiry would occur if the new liability rule were less 
drastic, as would be a rule which shifted the burden of proof in cases 
of intentional deviation from standard therapy to the defendant physician, 
requiring him to prove that the likely benefits to the patient outweighed 
the risks. Such a rule might well induce doctors to be more careful in 
their use of innovative therapy, without preventing those applications 
of a new therapy in which the benefits outweigh the risks to the patient. 

Enactment of special disclosure rules for both experimentation and
nonresearch innovative therapy poses fewer problems than do liability
rules. In Canterbury-type jurisdictions, disclosure of all information
material to a patient's decision to submit to experimental or innovative
therapies is now the disclosure rule. Requiring a similar rule in all
jurisdictions for both experimental and innovative procedures, if it is
not already required because of a professional custom of more complete
disclosure for research, 75/ should pose no major problems. It
might increase the time a physician spends in obtaining consent, but the
benefits thereby obtained seem greater. While more complete disclosure
of risks might lead some patients to reject an innovative procedure which
others would have chosen, this should not be of major concern, for the
lost benefit will be a result of the patient's informed choice.

At the very least, then, a disclosure rule should be enacted which 
requires that patients be informed of risks, benefits, and discomforts 
of experimental, innovative and alternative procedures, and the new or 






experimental nature of a proposed therapy. Enactment of a strict liability 
rule for injuries resulting from experimental or innovative therapeutic 
procedures requires a more precise analysis beyond the scope of this paper 
and should be explored. Shifting the burden of proving the reasonableness 
of a procedure, however, poses fewer problems and could fruitfully be 
enacted now. 

(2) Improving Peer Review 

A second alternative, if one finds existing controls inadequate
for innovative therapy primarily therapeutic in intent, would be to
develop peer mechanisms that, through review and feedback to the physician,
induce physicians applying innovative therapy to make better risk-benefit
calculations and more complete disclosure to patients. Alternatives here
range from education and development of precise norms and criteria for use
of innovative therapies, to monitoring of physician activities on a
continuous or random basis. The former may be a useful addition, but
one should not be overconfident of its impact. The latter might be very
effective, but involves tremendous costs and difficulties in arranging.
If created solely for uses of innovative therapy, the costs may be hard
to justify. Yet developing effective quality control mechanisms for all
medical decisions is far from realization. Consideration should at
least be given to developing and enforcing a practice of preuse
consultation, and after-the-fact review of applications of innovative
therapies, though the precise details of such a system await further
study.






IV. CONCLUSION 

Public policy for innovative therapy depends on the extent to which
innovative therapy poses risks for patients beyond those that exist in
ordinary therapy, and, secondly, on the efficacy of existing legal and peer
review mechanisms in minimizing those risks. If one concludes that a
special set of controls is needed, a major policy issue is whether all
innovative therapy is to be regarded as research and subject to the controls
applicable to research, or whether there are some instances of innovative
therapy to which the controls of research need not apply. Since a physician
may use innovative therapy primarily for the patient's benefit, with no
intent to acquire knowledge beyond the needs of the patient, the career,
scientific, and future-patient interests that call for special protections
in research may often be absent. In those situations, distinguished by the
specific intent of the physician, treatment of innovative therapy as research
is unnecessary to protect patients from the conflicts of interest inherent
in research. Requiring IRB approval for all innovative therapy would also
raise serious administrative, political, and legal problems at a time when it
is unclear that IRB review will substantially enhance patient interests
and lead to more informed consent, where no research intent is present.

Where there is a specific intent to acquire information about the 
procedure beyond the needs of the patient, it is appropriate to regard the 
physician as engaging in research. The intent to obtain knowledge may 
influence the physician's disclosures to the patient, and his decision to 
use the therapy. Innovative therapy in this situation should be subject 






to the same controls as research, including prior IRB review and the same
liability or disclosure rules. The key policy issue here is whether these 
controls will apply to all research, to research occurring in institutions 
receiving federal funds, or only to research directly supported by federal 
funds. Each alternative raises unique problems of scope, political
feasibility and constitutionality which recommendations for controlling research
should not ignore. 

Having divided the universe of innovative therapy into two classes on 
the basis of physician intent, the question remains whether primarily 
therapeutic innovative therapy should be subject to special controls or 
whether it should be treated like ordinary therapy. Assuming the former, 
these controls should not be more stringent than the controls enacted for 
research, because the risks are smaller. While further study of a strict 
liability rule for injuries occurring in primarily therapeutic innovative 
therapy is needed, shifting the burden of proof to the defendant physician may 
be more feasible. Requiring as complete disclosure as occurs for research 
in a Canterbury-type jurisdiction is clearly in order. In addition, the 
medical profession should be encouraged to develop clearer standards for 
using innovative therapy and review mechanisms that will informally monitor 
physician use of them. 






FOOTNOTES 



1. Congress, in establishing the National Commission for the Protection
of Human Subjects of Biomedical and Behavioral Research, explicitly
recognized the problems presented by boundary activities. It specifi-
cally directed the Commission to consider "the boundaries between
biomedical or behavioral research involving human subjects and the
accepted and routine practice of medicine," in carry-
ing out its study of ethical principles, guidelines and recommendations
to the Secretary of HEW. P.L. 93-348, Sec. 212(B)(i).

2. 45 C.F.R. Sec. 46.3(b). One could argue, however, that a departure
from standard therapy does not place a subject at risk if there is a 
reasonable basis for thinking that only such a departure could benefit 
the patient. In that case such a departure would be one of "the 
established and accepted methods necessary to meet his needs," if in 
fact it is standard medical practice to depart from accepted therapies 
when there is no reasonable hope of success and the benefits of the 
non-standard procedure outweigh the risks. 

3. There is currently ambiguity, if not actual confusion, as to whether
DHEW has the authority to require that institutions receiving DHEW
funds submit all research with human subjects, whatever the funding
source, to the review procedure required for research directly
funded by HEW. As a matter of practice, HEW presently appears to
take the position that an institution's general assurances pursuant
to 45 C.F.R. Secs. 46.1-.22 must include an assurance that all
behavioral and biomedical research, however funded, will be reviewed
by an IRB and consent protected.

However, the authority for this position is less than clear. Section
212 of P.L. 93-348 directed the Secretary of HEW by regulation
within 240 days to require entities applying for grants under the
Public Health Service Act involving research with human subjects to give
assurances that all research involving human subjects at the institution
would be reviewed. The regulations issued pursuant thereto, 40 Fed.
Reg. 11854-58, did not include such a regulation. Although one could
argue that 45 C.F.R. Sec. 46.21(b)(2) accomplishes the mandated
purpose, it is sufficiently ambiguous, and so clearly preceded
P.L. 93-348, that it hardly seems to discharge the duty required of
the Secretary.

Assuming the existence of the regulation required by P.L. 93-348, its
constitutional validity remains an open question. While Congress
may attach conditions to its grants under the spending power, the
Tenth Amendment would require that there be some limits on the condi-
tions it may attach. Based on language in United States v. Butler,
297 U.S. 1 (1936), one may argue that grant conditions must be
reasonably related to the purpose of the grant, and cannot regulate





Footnote #3 continued

activities which are not funded under the grant. If the courts so
limit Congress' conditional spending power, P.L. 93-348 and similar
attempts to regulate non-government funded research with human sub-
jects would be unconstitutional. For a more detailed discussion, see
Comment, "The Federal Conditional Spending Power: A Search for
Limits," 70 Northwestern L. Rev. 293-331 (1975).

4. Under Roe v. Wade, 410 U.S. 113 (1973) and Doe v. Bolton, 410 U.S.
179 (1973), such intrusion would be unconstitutional unless a
compelling state interest that outweighs the physician's and patient's
right to privacy in their relationship can be established. It is
far from clear that the possibility of abuse in using innovative
therapy is so frequent that its avoidance would constitute a suf-
ficiently compelling state interest.

5. A more technical formulation of the general rule is: "a physician
has the obligation to his patient to possess and employ such reason-
able skill and care as are commonly had and exercised by reputable,
average physicians in the same general system or school of practice
in the same or similar localities." Waltz and Inbau, Medical Juris-
prudence 112 (1971); see also Louisell and Williams, Medical
Malpractice 8.03-8.07 (1973).

6. See, e.g., Helling v. Carey, 519 P.2d 981 (Wash. 1974).

7. Carpenter v. Blake, 60 Barb. 488 (S.Ct. N.Y. 1871); Smith v. Beard,
56 Wyo. 375, 110 P.2d 260 (1941); Hodgson v. Bigelow, 335 Pa. 497,
7 A.2d 338 (1939); Sawdey v. Spokane Falls and N. Ry., 30 Wash. 349,
70 P. 972 (1902); Jackson v. Burnham, 20 Colo. 532, 39 P. 577 (1895);
Kershaw v. Tillbury, 214 Cal. 679, 8 P.2d 109 (1932); Graham v.
Dr. Pratt Inst., 163 Ill. App. 91 (1911); Medical Exam of Indiana v.
Kaadt, 221 Ind. 625, 76 N.E.2d 669 (1948). See generally, Krisanovich,
"Medical Malpractice Liability and Organ Transplants," 53 U. San. Fran.
L. Rev. 223, 272-277 (1971), and Waltz and Inbau, Medical Jurisprudence,
pp. 179-202, on which this and the following paragraph are largely
based.

8. Waltz and Inbau, 190; Karp v. Cooley, 493 F.2d 408, 423-424 (1974).
Although some cases have referred to experimentation as a separate
ground of liability, the evidentiary requirement for establishing
liability remains whether a reasonable and prudent physician would
have experimented in those circumstances.

Karp v. Cooley, 493 F.2d 423. While this clearly applies to experi-
mentation occurring in a therapeutic situation, its applicability
to non-therapeutic situations is less clear. In those cases liability
is likely to depend on the adequacy of consent. For the only
reported instance of damages awarded a volunteer for injury result-
ing from tests conducted solely for purposes of medical research, see
Halushka v. University of Saskatchewan, [1966] 53 D.L.R.2d 436 (1965)
(Canada) (ineffective consent to anesthetic tests; injuries included
"diminution of mental ability"; verdict for $22,500). In any event,
this paper deals only with experimentation occurring in therapeutic
situations.





9. Id., Waltz and Inbau, 190.

10. See note 3, supra.

11. 45 C.F.R. Sec. 46.2(b)(1).

11a. However, IRB review might be said to alter the likelihood of the risks
occurring, given an unfavorable risk-benefit ratio. This issue is
treated in the discussion of causation that occurs later in this paper.

12. According to the discussion in note 3, supra, this is not now the
case, even for research directly funded by HEW. In any event,
review is not now required by statute for all activity characterized
as research, whatever the funding source.

13. Prosser, Law of Torts 200-201 (4th ed. 1971).

14. Id.

15. Even if violation of the statute is found to be causally related to
the plaintiff's injury, a plaintiff who provided a legally valid
consent, depending on the information disclosed, could be found
to have assumed the risk that injury would occur. For a discussion
of assumption of the risk, see Prosser, 434-457. However, Waltz and
Inbau seem to view the matter differently. Op. cit., 199.

16. A similar analysis would apply if IRB review for research, though not
statutorily required, was customary practice for (1) HEW funded
research, (2) research in HEW funded institutions, or (3) all research
whatever the funding source. Failure to conform to a custom of review
would not in itself produce liability, though a court could
hold that it was unreasonable. Questions of causation and defenses
such as assumption of risk, and the problems they raise, would still
exist.

17. See generally, Waltz and Inbau, 152-177, and sources cited therein;
Louisell and Williams, Sec. 22.01.

18. Wilson v. Scott, 412 S.W.2d 299 (1967).

19. 464 F.2d 772 (D.C. Cir. 1972).

20. See, e.g., Cobbs v. Grant, 502 P.2d 1 (Cal. 1972); Cooper v. Roberts,
286 A.2d 676 (Pa. 1971); Wilkinson v. Vesey, 295 A.2d 676 (R.I. 1972);
Trogun v. Fruchtman, 207 N.W.2d 297 (Wis. 1973).

21. 464 F.2d 787-788.

22. Fortner v. Koch, 261 N.W. 762 (S.Ct. Mich. 1935); Fiorentino v. Wagner,
227 N.E.2d 296 (N.Y. 1967).

23. Presumably the two jurisdictions which adopted the rule, see note 22,
supra, would continue to require such disclosure, even though these
statements occurred before adoption of a Canterbury-type disclosure
standard.





24. 45 C.F.R. 46.3(c)(1)-(6).

25. 45 C.F.R. 46.2(b)(3).

26. 45 C.F.R. 46.3(c)(1)-(6).

27. Depending on the precise disclosure rule in effect in a non-custom
jurisdiction, HEW rules could require more disclosure than would
occur even under a reasonable person standard.

28. See pp. 28-45, infra.

29. This is especially true with surgery.

30. 227 N.E.2d 296 (N.Y. 1967).

30a. He also may intend to experiment with this procedure, but intend
only a one-patient experiment, rather than undertake to develop a
formal clinical trial.

31. A widely noted example was the development of portacaval anastomosis
for bleeding esophageal varices, which, when finally tested, was found
to lack the supposed efficacy. Warren, "Controlled Clinical Research:
Opportunities and Problems for the Surgeon," 127 Amer. J. Surgery
3-8 (1974); Spodick, "Numerators without Denominators: There is no
FDA for Surgeons," 232 JAMA 35-36 (1975); Strauss, "Ethics of Experi-
mental Therapeutics," 288 N.E.J.M. 1183-1184 (1973).

32 . Since there is a greater risk of unskillful application with a new 
procedure, a finding that unskillful application due to newness is 
negligent would also be a disincentive to use. Aside from this 
possibility, the possibility of damages because the innovative 
procedure may also be negligently applied would not appear to create 
additional disincentives to use. 

33. Waltz and Inbau suggest that the plaintiff's assumption of the risk,
as manifested in a legally effective consent, would not bar recovery
if use of an innovative procedure is unreasonable in the circumstances.
See Waltz and Inbau, 199.

34. The disclosure custom will also depend on the effect given the HEW
regulations as evidence of a disclosure practice. See the discussion
supra.

35 . Report of the Secretary's Commission on Medical Malpractice, 5-20 
(1973). 







36. Of course, physicians may disclose more information than the law
requires.

37. This statement assumes that the HEW regulations will not be taken
as evidence of disclosure practice.

38. See, e.g., Nuremberg Code of Ethics in Medical Research (1949);
Declaration of Helsinki (1964), in Waltz and Inbau 379-383.

39. Presumably scrutiny of surgery by tissue committees and departmental
review occurs more frequently than does review of medicine.

40. In patient rounds in an academic setting the justification for using
an innovative procedure is more likely to be questioned, though even
here the prestige of the attending physician may prevent rigorous
criticism.

41. One court has held that the hospital has no duty to assure that a
physician obtain legally effective consent from the patient.
Fiorentino v. Wagner, 227 N.E.2d 296 (N.Y. 1967).

42. For a thorough analysis and account of the moratorium as a peer
control device, see Swazey and Fox, "The Clinical Moratorium: A
Case Study of Mitral Valve Surgery," in Freund, ed., Experimentation
with Human Subjects, 315-351 (1970).

43. Swazey and Fox, however, might find the clinical moratorium to be
more effective than I suggest. No doubt it has been effective in
some instances, but without further evidence it does not appear
likely to operate in most applications of innovative therapy.

44. For an account of the history and functioning of PSROs, see Note,
"Federally Imposed Self-Regulation of Medical Practice: A Critique
of PSRO," 42 Geo. Wash. L. Rev. 822 (1974).

45. Since innovative therapy by definition will depart from PSRO standards,
PSRO review could discourage some applications of innovative therapy.
This will depend on the willingness of PSROs to accept a physician's
justification for departure from accepted practice. Conceivably, the
frequency of boundary activities will not be affected.

46. The use of an innovative therapy is, by definition, an intentional
deviation from standard practice.







47. Criteria for distinguishing innovative therapy from other forms of
research would also be needed if one wishes to regard all innovative
therapy as research, and then subject innovative therapy to control
procedures different from those applied to all other forms of research.

48. See pp. 34-35, infra.

49. While the rules for experimentation need not include IRB review, given
the history of federal controls on experimental activities it is likely
that public policy will require some form of IRB review. What is
unclear is whether IRB review will be required for all research with
human subjects or just for research funded by the government or occur-
ring in government funded institutions. Depending on the scope of the
IRB requirement, and the means used to impose it, constitutional
considerations may become relevant. See note 3, supra.

50. This assumes that IRBs actually do achieve these goals, though
empirical data verifying their efficacy do not exist. It is
particularly unclear whether IRBs will require innovative therapies
to be applied in rigorously controlled circumstances, thus tending
to turn each use of an innovative therapy into a formal clinical
trial. While an IRB could have this effect, the author's experience
on one IRB suggests that it may be unrealistic to expect significant
gains in this regard.

51. For example, the PSRO legislation was challenged in an unsuccessful
federal suit. Assoc. of Amer. Phys. and Surgeons v. Weinberger,
395 F. Supp. 125 (1975).

52. See note 4, supra. The government would face a more difficult challenge
than it confronted in the PSRO litigation because the regulation of
innovative therapy is not conditioned on receipt of federal funds.

53. See note 3, supra.

53a. If Medicare and Medicaid funded therapy were included in this category,
the administrative problems discussed above would occur.

53b. The general assurances do not speak explicitly of innovative therapy,
but rather commit the institution to adhere to the policies and pro-
cedures contained in 45 C.F.R. 46.1-.22. 45 C.F.R. Sec. 46.3(b)
defines subject at risk in a manner that appears to include innovative
therapy. See note 2, supra.









54. See Barber, Research on Human Subjects (1973); Gray, Human Subjects
in Medical Experimentation 235-256 (1975). While both the Barber
and Gray studies give little solace to IRB advocates, their find-
ings may reflect a temporary phenomenon that will pass with
greater IRB experience and development of more effective procedures.
The National Commission for the Protection of Subjects of Biomedical
and Behavioral Research may generate data showing greater efficacy
in both regards, or at least ways of increasing IRB efficacy.

55. Levine, Addendum to Boundaries Paper, September 25, 1975, p. 10a.

56. Id. at 18a.

57. Id. at 17a.

58. Id. at 10a, 11a.

59. Id. at 17a.

60. See note 50, supra.

61. 45 C.F.R. Sec. 46.2(b)(1).

62. Levine, "Boundaries Paper," prepared for the National Commission for
the Protection of Human Subjects of Biomedical and Behavioral Research,
pp. 6-7, 17, July 14, 1975.

63. Norton, "When Does an Experimental Innovative Procedure Become an
Accepted Procedure," Pharos, Oct. 1975, 161-162.

64. Francis Moore, for example, defines human experimentation "as either
the intentional employment of normal human subjects as volunteers
for physiologic experiments, or the study of patients (in a way that
would not directly benefit them) to gather information on a disease
or its treatment." "Therapeutic Innovation: Ethical Boundaries
in the Initial Clinical Trials of New Drugs and Surgical Procedures,"
Daedalus, Spring 1969, p. 502. Similarly, a subcommittee of the IRB
of the Center for Health Sciences of the University of Wisconsin
recently came up with this definition:

"any organized prospective process which seeks to secure
new information from humans or about humans and/or which
differs in any way from customary or generally accepted
medical practice."









65. Robert Levine's definition also appears to exclude this possibility,
though elsewhere he acknowledges that such activity is research.
See Levine, p. 10.

66. See Moore, op. cit., note 64, supra; Norton, op. cit., note 63,
supra.

67. LaFave and Scott, Criminal Law 196 (1972).

68. Francis Moore, for example, expresses special concern for the safety
of patients in the first use of a new drug or procedure. Op. cit.,
note 64, supra. However, the situations he discusses appear to
involve an experimental intent, and thus would be subject to review
on that basis.

Consider also the first heart transplant or use of a mechanical 
heart. A therapeutic intent in those situations cannot be 
denied, but it would be very difficult for Dr. Barnard or Dr. Cooley 
to maintain that they had no intent to gather knowledge about 
the procedure beyond the needs of the patient. 

69. This appears to be the case currently with most instances of innovative
therapy occurring in institutions receiving HEW funds. Few
instances of innovative therapy are submitted for review, either
before or after their use.

70. On the whole this statement appears to be true, though one can easily
imagine therapies whose innovative or non-accepted status would be
apparent to an observer, e.g., covering a patient with newspapers
to treat cancer.

71. Robert Levine appears to reach a similar conclusion when he states:

"The definition of research provided in this paper is
designed, in part, for the benefit of the professional
who will wish to distinguish which of his activities
may be viewed (by others) as research. He may be
advised that, at some moment when he is considering
performing some activity, he can consider whether his
intent is in part or in whole research as contrasted
with practice. In that case he may be advised further
to express his intent in the form of a protocol and have
it reviewed by an IRB. He may also be advised to conduct
his consent negotiations with the prospective subject
so as to make clear his intent to that individual."

Addendum to Boundaries Paper, 5a-6a, Sept. 24, 1975.









72. If the injury results from negligence, the subject might be able
to recover damages. However, if there is no negligence, the
subject is left bearing the cost of the injury.

73. Havighurst, "Compensating Persons Injured in Human Experimentation,"
169 Science 153-157 (1970).

74. For discussion of the complexities of such a decision see Calabresi,
The Costs of Accidents (1970); Havighurst and Tancredi, "Medical
Adversity Insurance - A No-Fault Approach to Medical Malpractice
and Quality Assurance," 51 Health and Society 125-168 (1973).

75. See the discussion of this point at pp. 11-14, supra.






17 



THE BOUNDARIES BETWEEN BIOMEDICAL RESEARCH INVOLVING HUMAN 

SUBJECTS AND THE ACCEPTED OR ROUTINE PRACTICE OF 

MEDICINE, WITH PARTICULAR EMPHASIS ON 

INNOVATION IN THE PRACTICE OF SURGERY 



David Sabiston, M.D. 



In the introduction of Levine's thoughtful position paper, he empha- 
sizes the fact that it is fortunate that sharp definitions between the boundaries 
of biomedical or behavioral research and accepted and routine medical prac-
tices are not required, a fact of much importance. As one pursues this sub- 
ject, it becomes evident that there is no dividing line which can be consist- 
ently agreed upon by any group of authorities on the subject. In fact, it is 
generally recognized that such an arbitrary division is simply impossible, at 
least if determined on a rational basis. Therefore, an objective of an apprais- 
al of this subject might be the development of a series of approaches leading 
to an improved and more complete understanding of this increasingly impor- 
tant issue.

At the outset, it can be stated that there are two parts of the spectrum 
which are definite: (1) those diagnostic and therapeutic areas in medicine 
about which the overwhelming majority of authorities would agree that the 
test or treatment is established beyond reasonable doubt. Fortunately, this 
portion of the spectrum in medical practice comprises the vast majority of the 
field today, and clearly this is true as applied to the surgical disciplines. At 
the opposite end of the spectrum are those studies which are clearly experi- 
mental and are being pursued for the acquisition of basic knowledge without 
any intent to suggest by implication or fact that the patient will immediately 
benefit. Again, the first portion of the spectrum represents a large area of 
daily endeavor and the latter a much smaller one. Between these two posi-
tions, there is a definite "gray zone" in which it is difficult to classify objec- 
tively the diagnostic test or the therapeutic program as accepted practice 

versus experimentation.




One point which can be appropriately made is the fact that the role of 
the intent of a given procedure might be profitably minimized, since it is al- 
most always impossible to prove this point, certainly from a legal point of 
view. Moreover, insofar as an individual patient is concerned, it might be 
said that there is often little difference between the approach to therapy and an ex-
periment since in modern medicine one should outline in detail the benefits
and risks in both situations. Moreover, quality control of patient care is and 
should be monitored by peer review groups, whereas human investigation 
should be controlled by institutional panels designed to review each protocol 
with membership of the panel broadly chosen, including informed members of 
the laity. In this connection, the comments of Philip Handler, President of the 
National Academy of Sciences, bear repetition. He succinctly summarizes the 
present status of human experimentation as follows: "It is no longer possible 
for an isolated investigator to go off on his own and simply do as he pleases. 
He is now accountable to his colleagues, in advance, before he may undertake 
any proposed experiment. Indeed, that very process has increased the sophis- 
tication of current medical research." Ultimately, all relationships between 
physicians and patients rest upon a personal agreement between the two parties. 
While it is recognized that in many instances such relationships between physi- 
cians and patients have eroded by comparison with the past, it is equally impor- 
tant to stress the need for a return to this important and much to be desired re- 
lationship . 

Dr. Levine's comments concerning "patients and subjects" and their
relationship on the one hand to a health care professional and on the other as 




an individual who is to be observed or experimented with by an investigator 
do represent the situation at the two ends of the spectrum, but a significant 
number of persons fall into an intermediate category difficult to define. His 
comments on the natural history of various diseases are also quite significant 
since it is such data that provide the physician and surgeon with the appro-
priate facts to discuss with the patient the problem and, frequently, the need
for experimentation in an effort to improve both the quality of life as well as
its length. The thoughts expressed about the fiduciary relationship of experimen-
tal studies are also well taken. While monetary reward is often significant in 
terms of separation of therapy from pure research, such is not an adequate 
or appropriate classifying device. 

Every physician, and indeed many informed laymen, recognize that 
most of the advances in medicine have derived from what must be defined as 
"human experimentation." The surgeon generally insists first upon the per- 
formance of new operative procedures in the experimental animal with careful 
attention being given the clinical course as well as the biochemical, physio- 
logical, and pathological changes which follow. Nevertheless, when the op- 
eration is first performed on humans, by definition it must be termed an ex-
periment, although one being done with sound preliminary knowledge. Un-
der these circumstances, it is imperative that the patient be fully apprised
of everything that is known and of the risks involved. Obviously, informed
consent in the fullest meaning of the term is essential.

It is also recognized that many medical advances have been made as 

a result of totally healthy human volunteers who have nothing to gain except 




personal gratification, at least immediately, from the scientific information that 
might be derived from an experimental study. For example, the entire field of 
the transplantation of human organs has been greatly advanced by those healthy 
donors willing to undergo an operation for removal of one of the two normal kid- 
neys to be transplanted into a patient with life-threatening renal insufficiency.
It is apparent that while the total risk of the operation upon the donor is low, 
nevertheless it is real and could indeed in rare instances be life-threatening. 
Despite this fact, when the need arises, it is usual for a volunteer to be forth-
coming, with full realization of the potential hazards which might occur.

A classic example of the advantages to mankind from human experimen- 
tation is summarized in the following historic citation: "Professor Forssmann: 
As a young doctor, you had the courage to submit yourself to heart catheteri- 
zation. As a result of this, a new method was born which since that time has
proven to be of great value. It has not only opened new roads for the study of
the physiology and pathology of the heart and lungs, it has also given the im- 
petus for important researches on other organs." This short, yet profound, 
introduction of a historic contribution to medical science comprised the cita- 
tion to Werner Theodor Otto Forssmann when he was awarded the Nobel Prize 
in Medicine in 1956. The interesting feature of this monumental achievement
is the fact that, as a 25-year-old intern in surgery, this pioneer, after repeated
trials of cardiac catheterizations in the cadaver, introduced a catheter into his 
own arm vein and passed it into the right ventricle of his heart. Despite the 






fact that he had approached a member of the faculty and a fellow intern to as- 
sist with the procedure, both refused to assume any responsibility for the ex- 
periment.

In current surgical practice, it is well recognized that the majority of 
operations performed in this country are those which are widely accepted as 
standard practice with results of proven efficacy. Thus, the removal of the 
appendix for acute inflammation, removal of stones from the common bile duct
in obstructive jaundice, the removal of most neoplasms (especially those with- 
out evidence of metastases), and the surgical drainage of purulent abscesses
are typical examples. However, many procedures might appropriately be clas- 
sified in an intermediate category including operations such as intestinal by- 
pass operations for control of obesity and for hyperlipidemias.

In the recent past, much emphasis has been given the subject of revas- 
cularization of the heart for myocardial ischemia (coronary arterial bypass 
procedures) . While it is clear that the non-operative management of angina 
pectoris and its complications is often effective, nevertheless in many in- 
stances, this form of therapy leaves much to be desired. The development 
in the past decade of the coronary bypass procedures has led to the wide- 
spread adoption of this technique with an estimated 50,000 or more of these 
operations being done annually in the United States. Nevertheless, justifi- 
able controversy continues concerning the indications for such therapy and 
indeed the long-term results. On the basis of the data available, it is gen-
erally accepted that the relief of pain is achieved in approximately two-thirds 






of the patients and an additional 15 to 20 percent receive partial relief of an- 
ginal discomfort. One of the most desired results of this operation is the pro-
longation of life, and upon this point there is conflicting evidence. However, 
at this point in time the preponderant view supported by accumulated statis- 
tics indicates that the operation does not extend the length of life when com- 
pared with appropriate controls managed medically. For example, the Veter- 
ans Administration Hospital system has recently completed a five year random- 
ized study of a series of patients with documented angina pectoris due to sig- 
nificant atherosclerotic obstructing lesions in the coronary arteries . All pa- 
tients were reviewed by a cardiological and surgical panel in the cooperating 
centers, and it was agreed that each was an appropriate candidate for surgi- 
cal treatment by contemporary criteria. The plan for the randomized study 
was carefully reviewed with each patient and explained in appropriate detail. 
Following this, an envelope was opened which committed the patient either to 
medical or surgical therapy. Thus, among the patients in the study, half 
were operated upon with the performance of a bypass graft and the remaining 
half were managed by customary medical (non-operative) therapy. The inves-
tigators chose not to study the relief of anginal pain in these patients, but rath- 
er directed their interest toward longevity. It was interesting that the life ex-
pectancy of these patients was the same in each group, with the exception that 
those patients who had significant stenosis of the left main coronary artery had 
an improved life expectancy following surgery. (In most series, obstruction
of the left main coronary artery comprises approximately 10 percent of the total 






patients undergoing coronary arteriography for angina pectoris.) Thus, while 
this operation is widely employed, attention should be directed toward the known 
facts concerning the benefits which can reasonably and objectively be expected 
from the procedure.

Every surgical procedure is in a sense an experiment, since one cannot 
predict with accuracy the development of postoperative complications which may 
ensue, as for example the appearance of a wound infection. In fact, in his orig- 
inal report of the cardiac catheterization upon himself, Forssmann mentioned that 
he developed a wound infection in the self-made incision.

Thus, from a surgical point of view, innovations are being made daily as 
an individual surgeon finds improved results with specific changes in operative 
technique. While these may be minor, it should be noted that they often arise in 
specific situations not previously encountered and call for a decision to be made
immediately in order to prevent a perilous outcome. Since the patient is anesthe-
tized and usually cannot be safely awakened, total informed consent is not pos- 
sible. An example of this type is the pioneering contribution of Dr. Bertram M. 
Bernheim. A student of the noted surgeon, William S. Halsted, in 1915 Bern- 
heim operated upon a patient with a painful and expanding aneurysm of the pop- 
liteal artery which threatened to rupture. Prior to operation, he had demon- 
strated that temporary occlusion of the femoral artery above the aneurysm pro- 
duced clinical signs of ischemia in the leg distally. Therefore, he knew in ad- 
vance that it would be necessary to leave a portion of the aneurysm to allow con- 
tinuity of blood flow from the femoral artery above into the popliteal artery below;






otherwise gangrene of a portion of the leg would ensue. However, at opera- 
tion the aneurysm was so thin-walled and the tissues of such poor quality that 
none of it was available for restoration of continuity of the artery above with 
that below. Therefore, rather than simply ligating the two ends of the arteries, 
which were quite far apart and not available for direct anastomosis, he removed 
a segment of saphenous vein and used it as a substitute. Dr. Halsted, in com- 
menting upon this pioneering achievement, called it the "ideal operation for the 
treatment of a popliteal aneurysm." However, this was not predictable before- 
hand but represented a reasonable alternative to what otherwise would have 
been a disastrous result, that is, amputation of a limb. Obviously, Dr. Bern- 
heim was willing to assume the responsibility for his action, and it is clearly an 
example of appropriate judgment and action in an admittedly difficult situation. 

Summary 
In the consideration of boundaries between biomedical or behavioral re-
search and the accepted routine practice of medicine, it is apparent that while 
the establishing of such distinctions is desirable, it is nevertheless extraordi- 
narily difficult. In the surgical sciences, innovative changes are both essen- 
tial and desirable in daily practice. Moreover, in the clinical setting of sur- 
gery, it is not always possible to predict the situation which will be encount- 
ered and therefore to have the opportunity to provide total informed consent. 
Nevertheless, the key feature of both modern therapy and research is based 
upon a detailed and frank exchange between the physician or investigator and 
the patient. While it is important to define the intent, from a legal point of view 






such is exceedingly difficult to prove. In the vast majority of instances, the 
most appropriate means of monitoring quality control in medicine is by the 
peer review mechanism, whereas monitoring of human investigation is best 
achieved by review panels broadly composed to specifically evaluate and de- 
cide upon each protocol proposed. Clearly, human investigation in the sur- 
gical disciplines, as well as in all of medicine, is essential if the advances 
characteristic of the past several decades are to continue. 









18 

WHAT PROBLEMS ARE RAISED WHEN THE CURRENT DHEW REGULATION 

ON PROTECTION OF HUMAN SUBJECTS IS APPLIED TO SOCIAL 

SCIENCE RESEARCH? 

Richard A. Tropp 



Formerly Office of the Secretary, DHEW 



Question Presented 

What amendments, if any, should be made in the current DHEW regulation 
on "Protection of Human Subjects" (hereinafter, "Part 46") in order to 
facilitate the application of the regulation to social science research? 
What issues and problems are raised by application of Part 46 as it 
stands to such research? 

It is assumed, for purposes of this analysis, that the expression 
"social science research" includes behavioral research conducted 
outside of the clinical psychological setting. It is unnecessary for 
purposes of the analysis, and for drafting possible amendments to 
Part 46, to reach the issue of where "social science research" is 
discontinuous with "behavioral research" — although it is precisely 
this thorny boundary question which has been the focus of the greatest 
wrangling between the agencies within DHEW which have been discussing 
possible amendments to the regulation. 

Background 

Under the gun of imminent passage by Congress of the National Research 
Act, the Secretary of DHEW on May 22, 1974 signed a regulation on 
"Protection of Human Subjects" for Federal Register publication on 
May 30. The regulation was the product of an extended drafting process 
by NIH staff, assisted by DHEW General Counsel staff assigned to, and 
housed within, NIH. The Department's other line agencies — the Office 
of Human Development, the Social and Rehabilitation Service, the Office 
of Education, and the National Institute of Education, inter alia — were 
not involved in that drafting process; the staff offices within the Office 
of the Secretary were not involved until very late in the game. 

Consequently, the regulation came as a great surprise to the rest of 
the Department, which was collectively taken unaware not only by the 
applicability of Part 46 to all Department activities, but also at finding 
out that the Guidelines preceding Part 46 had, on their face, applied 
to the other agencies all along. At the time Part 46 was published, 
substantial differences had arisen within the Department — and, under 
the deadline pressure, had not been resolved — on the applicability of
the regulation to non-biomedical research and to demonstration and 
service delivery programs. 

Notwithstanding the absence of consensus within the Department, the 
regulation was published in order to meet the perceived needs of the 
Congressional conference committee then considering the National Research 
Act (now P. L. 93-348). It was understood within the Department — and 




alluded to in the preamble to the regulation — that discussion and 
negotiation would proceed among the agencies and the OS staff offices 
in order to construct a regulation appropriate to social science research 
and to operating programs. It was intended by the parties involved in 
the decision to publish the regulation, for example, that income mainte- 
nance and health services financing experiments not be constrained by 
a regulation written with biomedical research as its conceptual framework. 

Extended discussion among the affected organizations within DHEW has 
made it clear that the agencies generally are responding to the regulation 
by ignoring it, as they did the Guideline which was its predecessor. The 
discussion has, however, begun to educate policy-level agency staffs on 
their responsibilities under the regulation, and has generated reflection 
on how the regulation might be optimally structured so as to protect 
subjects involved in non-health-services research. There has been some 
clarification of precisely what questions Part 46 raises, and whose 
interests each question affects. 

This analysis will identify those major questions, and will suggest 
some alternative remedies available to the Commission if it should 
choose to consider amending Part 46 in order to maximize its applicability 
to all Department research. 

1. Explicit Coverage of Social Science Research 

Although social science research is implicitly covered by the Applicability 
section of Part 46, the history of the regulation has caused many, if 
not most, grantees and contractors to assume that only biomedical and 
clinical psychological research funded by the agencies within the "H" 
part of DHEW is covered. Other agencies within the Department see Part 
46 as being ambiguous on whether human subjects at risk arising from 
social science research are protected. 

The language of the informed consent requirement, which seems to many 
grantees and contractors to be particularly tailored to biomedical 
research, reinforces their predilection — and that of agency staff 
outside the "H" organizations — to assume that the regulation simply 
does not apply to them. 

In order to send a clear signal to grantees and contractors, and to 
all agencies of the Department, that all DHEW-funded research is to 
be covered by Part 46, perhaps the regulation should specify that its 
scope of coverage incorporates social science research. Alternatively, 
perhaps the preamble to the regulation ought to specify that the 
ambiguous Congressional language "behavioral research" should be 
construed to encompass all non-biomedical research funded by DHEW. 

2. Coverage of Intramural Research 

For most of its history, Part 46 has not covered human subjects involved 
in research conducted by employees of the Department (intramural research) , 
only research conducted outside DHEW under grants and contracts (extramural 
research). NIH has long protected subjects of its own intramural research, 
but no other agency of the Department has had its own procedures to 




regulate intramural social science research and behavioral research 
conducted outside of a clinical psychological setting. 

In August 1975, as an afterthought to the regulation on fetal research,
a new subpart was added to Part 46 in order to achieve the end of regu- 
lating all DHEW intramural research. That new subpart tries to say that 
the substantive standards which Part 46 applies to extramural research 
will hereinafter apply to all DHEW intramural research as well, but 
that each agency may — emulating NIH — set up its own internal procedures 
to enforce the application of those substantive standards. The intent 
was to permit "H" to retain its current internal procedures, while 
compelling the other agencies to establish procedures which they 
presently lack. 

Assuming that this approach is the optimal one, the new intramural 
research subpart is at best unclear on just precisely what it is that 
the agencies have to do. Since it is not incorporated into the main 
body of the regulation, it is generally unknown within DHEW. At the 
minimum, it would seem useful for the substance of the new subpart to 
be transferred to the Applicability section of Part 46, and for it to 
be rewritten so as to be specific in its guidance to agency heads on 
what it is that they have to do tomorrow as a consequence of this new 
wrinkle in the regulation. 

It may be, however, that the approach of many different agency procedures 
is not the optimal one, on the ground that it is neither seemly, nor 
consistent with the intent of independent review of research proposals, 
for employees of an agency to review assurances of compliance from 
other employees of the same agency. 

Under the section Submission of Assurances (§46.4 of Part 46), assurances
of compliance with the regulation must be filed by grantees and contrac- 
tors with the Department, and must be approved as consistent with Part 
46 prior to funding of the research. Perhaps that section should be 
amended to require that when agency staffs propose to conduct intramural 
research, assurances of compliance must be filed with, and reviewed by, 
one of the staff offices within the Office of the Secretary or, alternatively, 
a board of outside advisors to the Secretary. Research involving risk 
of physical injury, and research conducted in a clinical psychological 
setting, could remain within the bailiwick of H's intramural review 
procedures. 

Establishing a procedure within OS to review agency research for compliance 
with the regulation, and requiring that intramural research must receive 
OS compliance approval, would maximize uniformity across the Department 
of protection of subjects involved in behavioral and social science research. 
A body of administrative case law could be established to which agencies 
would turn for guidance. An OS staff office procedure, or an outside 
board, would be of assistance to an agency head caught in cross-pressures 
on whether he should authorize an ethically dubious intramural project. 

It would be useful for the Commission to examine (i) whether it is satisfied 
with the current approach of many different agency internal procedures 
enforcing one uniform substantive standard; (ii) if so, whether it is 
satisfied with the extent to which the new subpart clarifies for agency 




heads what is to be construed as "procedural" (and therefore subject to 
variance) and what as "substantive" (and therefore not subject to discre- 
tionary implementation by an agency head), and whether the language is, 
generally, sufficient guidance to agency heads and research staff; and 
(iii) if not, what alternative, possibly including OS staff or advisory 
board review, would be most likely to ensure substantive compliance with 
Part 46 by Department employees who conduct intramural research. 

3. Protection of Individuals at Risk Who Are Not "Subjects" of Research 

In social science and non-clinical behavioral research, persons may be 
placed at risk of harm even though the research does not generate data 
about their behavior, and is not intended to intervene in their lives. 
The researcher never encounters them in the course of administering his 
research project, but he may be unable to prevent external diseconomies 
which accrue to them from his experimental intervention or from the data 
collection process. For example, 

(i) Apartment rents may be driven up in neighborhoods which house 
a threshold mass of housing allowance experiment subjects. The 
effects of the price rise will be felt by nonparticipant neigh- 
bors of the subjects, and by those who seek to move into the 
neighborhood. 

(ii) Labor supply prices may be driven either up (if subjects opt 
out of the labor market) or down (if subjects remain in the 
labor market, but become willing to take much lower-paying jobs 
as long as they also obtain an income supplement with an acceptably 
low marginal tax rate on earnings) in the labor market which 
contains a threshold mass of income maintenance experiment 
subjects. Depending upon which way prices go, either nonpar- 
ticipant employers or nonparticipant competing employees will 
be financially harmed. 

(iii) A police deployment or patrol pattern experiment may transfer 
some kinds of crime from one neighborhood to another, thereby 
benefiting some nonparticipant individuals and harming others. 

(iv) A health insurance experiment may increase the price, and decrease 
the supply, of some scarce medical resources in a particular area. 
At the extreme, a nonparticipant individual may die as a consequence 
of being priced out of the market for a scarce life-saving resource, 
which goes instead to an experimental subject whose purchase of 
the resource is subsidized by the research. 

The current regulation does not extend its protections to anyone who is 
not directly a subject of research. If the regulation is to be appli- 
cable to all behavioral and social science research, arguably the definition 
of "subject at risk" (§46.3) should be amended in order to create a new
class of persons at risk who are protected even though the researcher does 
not perceive or treat them as subjects. There is no such class in the 
current regulation because the definition of "subject at risk" was drafted 
within the conceptual framework of a biomedical research model. 




Some DHEW attorneys have argued that nonparticipants at risk arising from
social science research should not be protected by Part 46, or should not 
be as rigorously protected, since the Department owes them no duty under 
current law. Case law has, in contrast, established clear responsibilities 
by the biomedical researcher toward his subject. Were Part 46 to be amended 
to extend those responsibilities to the nonparticipant at risk, DHEW would 
open itself, and its research contractors and grantees, to novel legal 
liability. 

It is quite true that the case law of informed consent has thus far been 
limited to factual contexts involving face-to-face contact between a 
biomedical researcher and his subject. It does not follow from that, 
however, that the courts will find a nonparticipant at risk to have no 
claim. The matter has simply not risen to judicial attention. It may 
readily be argued that a court will soon find a plaintiff nonparticipant 
at risk to be, with respect to social science research, in the same 
position as the subject of biomedical research, and therefore to be 
entitled to protections analogous to those of Part 46. 

Even assuming that judicial remedy would be restricted to subjects 
who have chosen to participate in research, so what? The limitations 
of current law need not constrain either the Secretary or the Commission 
in parceling out what kinds of protections are ethically — if not legally —
owed to nonparticipants at risk arising from research funded by DHEW. 
The current regulation, in fact, offers protections to subjects which 
exceed the protections upon which the judiciary has reached consensus. 
The Commission can recommend, and the Secretary can make, new law. 

It has also been argued that creation of a new class of administratively 
protected nonparticipants at risk would be detrimental to some biomedical 
research, since family members and friends of subjects could claim harm 
solely by virtue of their relationship with a subject who is actually 
at risk of harm arising from his participation in an experiment. 
Assuming that it is undesirable to compel biomedical and behavioral 
researchers to seek the informed consent of family members and friends 
who may be at risk solely because of their contact with a research subject, 
the problem can be avoided by incorporating into the regulation a new 
definition of "physical injury" and, perhaps, of "psychological injury". 
The definition could specify that injury cannot be claimed, for purposes
of invoking the protections of Part 46, solely by virtue of a person's
family or other relationship with a research subject.

Were that definition written into Part 46, creation of a new legally 
protected class of nonparticipants at risk would not constrain biomedi- 
cal research. It would, however, protect nonparticipants unwittingly 
at risk arising from social science research. 

4. Should Participants in National Demonstration Programs and Service 
Delivery Programs be Covered by the Regulation? 

Part 46 presently extends its protections to participants in all "research, 
development and related activities" funded by DHEW. "Development and 
related activities" is undefined, and may be construed to cover non-biomedical 
demonstrations and service delivery programs. Were the agencies to take 
the regulation language seriously, a number of interesting problems would 
follow:



(i) National demonstration programs such as Head Start and 
youth services systems would be required to have each 
grantee create an institutional review board. In the 
politically supercharged community environment within
which the grantees function, the constitution of such 
a board — and its power to constrain a program director — 
might well become political footballs tossed between 
community groups struggling for legitimacy and power. 
That is a cost arguably worth incurring when there is 
more than minimal risk to a child, but is it still worth 
it when the IRB is to be established — and consent sought 
from every parent — simply because Head Start and youth ser- 
vice systems depart from the established and accepted 
methods of reaching children? 

In the eyes of managers of these and a number of other 
non-biomedical national demonstration programs, the 
prospect of creating an IRB and seeking consent from every 
participant's guardian is an explosive and unnecessary
nightmare.

(ii) On its face, the regulation would also require consent 

and IRBs of every grantee who conducts a service delivery 
program which departs from established and accepted methods 
of meeting participants' needs — even though the risk is 
marginal, and even though the program is not perceived by 
DHEW as either an experiment or a demonstration. Community 
mental health centers would be required to conform to Part 
46, for instance, as would schools which receive compensatory 
education funds under Title I of the Elementary and Secondary 
Education Act. 

Conformity with Part 46 by these kinds of programs raises, 
on a national scale exceeding that of demonstration programs, 
the prospect of widespread community infighting triggered by 
allegations of marginal risk. 

Although the non-"H" agencies have striven to avoid applying Part 46 
to national demonstrations and to service delivery, it seems inescapable 
from the face of the regulation language that they will have to begin 
doing so. If the Commission and the Secretary deem that to be a desi- 
rable outcome, it would be helpful to agency managers if Part 46 were 
amended to make it explicitly clear that the intent is to include all 
DHEW grantees and contractors, not only those engaged in research and 
development. 

Alternatively, perhaps the regulation should be amended to specifically 
exclude from its protections persons receiving benefits from national 
demonstrations and from service delivery programs, save for biomedical 
national demonstrations which — like clinical trials or HMOs — may involve 
risk of physical harm to participants. 






5. Should the Regulation Protect Subjects and Others Against Injury 
Suffered by Them in Their Capacity as Members of a Group? 

Part 46 protects a subject at risk of "psychological injury" or "social 
injury", without defining those expressions. Absent a definition of 
"psychological injury", someone may claim risk of injury if the interests 
of his racial, ethnic, religious, economic, or community group seem to 
conflict with a particular research project — even if there is no other 
risk of harm to the individual separate from the alleged harm to his 
group. Moreover, someone may claim risk solely because he is a relative 
or friend of someone who has actually been injured (has become depressed, 
for instance, or has lost self-esteem) by research. 

With "psychological injury" already a component of the definition of 
risk, the additional expression "social injury" opens a Pandora's box 
of allegations of injury to an individual in his capacity as member of 
a group or community. If the only risk alleged with respect to a 
particular research project is injury to a group or community, a large 
dose of political hoopla will doubtless accompany the establishment of 
an IRB and the submission of a general or special assurance under the 
regulation. 

Given the inevitable political conflict, the question is whether alle- 
gations of group or collateral psychological injury should be sufficient 
to trigger the protections of the regulation, absent a separately iden- 
tifiable risk of individual injury. If not, the expression "social 
injury" should be stricken from the regulation, and a new definition 
of "psychological injury" should be added to Part 46, specifying 
that risk of such injury refers only to that injury which a person may 
suffer in his individual capacity, and not merely in his capacity as 
a relative or friend of a research subject, or as member of a group or 
community. 

6. Should Risk of Financial Injury be Covered? 

Part 46, drafted within a biomedical conceptual framework, contains 
no reference to risk of financial injury. The regulation consequently 
fails to protect persons participating in income maintenance, health 
insurance, and other social science research funded by most of the agencies 
in DHEW. 

Assuming that Part 46 is to protect persons at risk in all research 
conducted or supported by HEW, risk of loss of present or anticipated 
assets or income ought to be incorporated into the definition of risk. 

7. Risks Arising from Publication or Policy Application of Research Results 1/ 

Social science research is sometimes met with interest group or community 
protests on the ground that publication of a research conclusion (cf. 
Arthur Jensen's research), or government policy changes based on the 
research results (cf. the income maintenance experiments, particularly 
in Gary, Indiana), will be harmful to the group or community as a whole, 
although specific risks to specific individuals cannot be identified. 

The regulation is silent on whether such alleged risk triggers its 




protections, but a number of grantees and contractors have run up 
against the question. Where it has arisen, it has been highly 
politicized. 

If indeed we do want such risks explained to subjects (in, for instance, 
educational performance research which will compare ethnic or economic 
group performance on IQ or achievement tests), and considered by IRBs, 
then that intent should be made explicit in the definition of risk. 
If not, it would be helpful to those conducting field social science 
research if language were added to the definition of risk providing 
that, except as research results pertain to a named or identifiable 
person, "risks arising from publication or policy application of 
research results" will not be deemed sufficient to trigger invocation 
of the protections of Part 46. 

The exception for research results pertaining to a named or identi- 
fiable person will protect the subject of biomedical or clinical 
psychological research whose case history has been taken, and whose 
privacy would be invaded by publication of material from that case 
history. 

8. Must All Research Procedures, and the Purpose of Research, be 
Explained to the Subject? 

Part 46 presently requires that all research procedures be explained, 
in all types of research, regardless of whether particular procedures 
do or do not cause a subject to be at risk. It is also required, as 
part of the informed consent process (§46.3(c)), that purposes be 
fully explained to the subject, regardless of whether particular 
purposes are material to his determination of risk to him. 

DHEW's Guidelines until 1974 did not specify that purpose be disclosed, 
and the American Medical Association's principles still do not. Disclo- 
sure of purpose is, however, required in the Nuremberg Code, the Decla- 
ration of Helsinki, and the World Medical Association Code. 2/ Several 
of the participants in the recent Brookings conference on social experi- 
mentation went out of their way to suggest that "There should be no 
ethical responsibility to inform subjects in analytical detail about 
the intent of the research, "3/ and 

(i) "/T/o disclose the purpose of the research may jeopardize 
the scientific validity of the results. This is certainly 
true in social science research since it is concerned with 
the behavior of subjects. . . .This behavior may be influenced 
not only by the pure treatment, but by. . .the subject's 
perception of the experimenter's expectations. To tell 
a subject in a health insurance experiment that you will 
be interested in how he utilizes medical services may well 
bias his response, particularly if the explanation is 
followed by frequent questions about health. "4/ 

(ii) "The most appropriate course /for the researcher, in 
obtaining informed consent from a subject/ seems to be 
to emphasize the important facts that will influence 
their decisions to participate. "5/ 




(iii) "/E/xperimenters have no moral obligation to give subjects 
more information than they need to act in their long-run 
best interests, particularly if there is a risk that subjects 
might respond differently. . . ."6/ 

(iv) "The only thing he /the researcher/ can do is. . .give the 
subjects all information relevant to their own decision 
to participate."7/ 

The problem is that explanation of research purpose, and of some 
research procedures, will skew research results in many types of 
behavioral and social science research, because the subject's beha- 
vior will be affected by his acquisition of the knowledge. Whether 
or not a subject takes a job while he is receiving benefits under an 
income maintenance experiment, for instance, may well be affected by 
his knowledge that the major purpose of the experiment is precisely 
to discover whether or not the income supplement affects his labor 
market decision. 

What the Brookings conference participants generally argue is that 
research purpose and procedures should be disclosed only insofar 
as the information is material to the subject's decision process as 
to whether or not he will participate in an experiment , and on what 
terms. An alternative formulation is to require explanation only of 
those research procedures which may cause an individual to be at 
risk, including identification of any procedures which are 
experimental. If only information material to the calculation of 
risk is disclosed, perhaps research purpose may be omitted most 
of the time in securing informed consent. 

Whether the Commission elects to adopt the Brookings conference 
consensus (explain what is material to the subject's decision), 
the risk test (explain only what is material to determination 
of risk; omit explanation of purpose entirely if it is not), or 
a third alternative, this is an issue which badly needs examination. 
As currently drafted, the language of the regulation's definition of 
informed consent is inappropriate to non-biomedical research. 
It erects for behavioral and social science research a disclosure 
requirement which goes far beyond what is necessary to enable a 
subject to make rational choices in the informed consent process, 
and it does so at the cost of skewing research results. 

Practically, what seems to be happening now is that DHEW agencies, 
including those agencies within "H" which conduct and support 
behavioral research, simply ignore this requirement, or effectively 
waive it through an inappropriate use of the regulation's modification 
clause (§46.10(c)). The seemingly stringent requirement for complete 
disclosure of procedures and purposes has the effect, in the real world 
of research, of protecting subjects much less than a moderated, enforceable 
requirement would. 






9. Must Benefits Expected from the Research, and Alternative 
Procedures, be Explained to the Subject in Social Science Research? 

Part 46, within the framework of the biomedical model, currently requires 
explanation to the subject of benefits which he may expect from the 
research, and of "appropriate alternative procedures that might be 
advantageous to the subject". 

Explanation of benefits, like explanation of research purposes and 
of some research procedures, may skew social science research results 
by affecting the subject's behavior, particularly if the subject is in 
a control group and understands the difference between the benefits 
which he is receiving and those which accrue to members of an experimen- 
tal group. 

In biomedical research, there may be standard and accepted procedures 
which are real alternatives for a subject in research. In social 
science research, no such beneficial alternatives usually exist, while 
an infinity of benefit permutations (how much money and what kinds of 
services we provide in an income maintenance experiment, for instance) 
may be available. Explanation of all possible benefit packages would 
burden the researcher to no gain by the subject, and may cost the 
researcher loss of subjects. 

Perhaps the informed consent definition should be amended to provide 
that all benefits and alternative procedures need be explained only, 
as in biomedical and some behavioral research, when a standard and 
accepted therapeutic option is available. The same requirement could 
be maintained for those types of research to which it is material, 
while a needless burden would be removed from social science researchers. 

Alternatively, perhaps benefits and alternative procedures should 
be explained whenever a standard and accepted option is available 
(when, for instance, the subject in a housing allowance experiment 
could obtain a higher subsidy from another program, were he to 
withdraw from the experiment) , irrespective of whether the option 
is "therapeutic" within the biomedical and clinical psychological 
models. 

10. Should Possible Breach of Confidentiality of Data Collected in 
Survey Research be Considered a Risk Which Triggers the Protections 
of This Regulation? 

Survey research raises most acutely a problem inherent in all data 
collection: is breach of confidentiality of the data collected to 
be considered a risk which triggers invocation of Part 46? The current 
regulation is silent on the issue, permitting the inference that 
breach of confidentiality may be construed as an "attendant discomfort 
or risk reasonably to be expected" (§46.3(c)). It follows, if the 
inference is made, that the survey researcher must, before he begins 
to ask his questions, describe in detail the various ways in which 
respondent confidentiality may be breached, and obtain the respondent's 
formal informed consent. 




If the research investigator has to proffer a lengthy explanation of 
the risk and obtain a consent form, the probability is high that he 
will lose many of his chosen respondents, thus making it difficult or 
impossible for him properly to randomize or stratify his sample. Some 
or many of those whom he does not lose will prove less than frank in 
their answers, destroying the utility of his data. 

Breach of confidentiality under judicial or other governmental subpoena 
definitely is a risk, as David Kershaw recounts in the Brookings con- 
ference in noting that a grand jury, at least two welfare departments, 
the General Accounting Office, and the Senate Finance Committee attempted 
to secure confidential data from the New Jersey income maintenance 
experiment (mostly in order to track down fraudulent welfare recipients). 8/ 
There is, moreover, the simple danger that gossip by survey research 
employees engaged in data collection or analysis will harm a respondent. 

The effect of rigorous imposition of the informed consent requirement 
in survey research can, on the other hand, destroy the utility of 
the research design and instruments: 

"In short, informed consent procedures are going to make social 
research inaccurate. The amount of error is unknown, and will 
remain forever undeterminable. . . .The study clearly demonstrates 
that the inclusion of informed consent procedures in some types 
of social science /survey/ research will lead to serious loss of 
data and /to/ response bias in some circumstances. "9/ 

In order to minimize the effects of data loss and response bias, moreover, 
it is — as Donald Campbell has noted 10/ — essential for data to remain available 
for sample reinterview. This is particularly true when surveys are 
focused on service delivery by states and units of local government, 
and when there is a need for Federal auditing of the data in order to 
ensure that services have actually been delivered as reported. 
Data verification, whether for these purposes or simply to check 
interviewer honesty and competence (Campbell's concern), imposes 
additional risks of breach of confidentiality which, if explained to 
the respondent, will induce further respondent loss and response bias. 

One way to handle the problem may be to amend the definition of informed 
consent, in Part 46, to provide that if the survey research investigator 
has established measures to ensure confidentiality of collected data, 
and if he has tersely informed the respondent that the risk of breach 
exists and that the measures exist, the risk of breach of confidentiality 
will not be considered an "attendant discomfort or risk reasonably to 
be expected", and will therefore not trigger the protections of the 
regulation. What would be required of the survey researcher is that 
steps be taken to actually protect confidentiality, and that the sub- 
ject be informed that such steps have been taken. 

The effect of such an amendment would be, assuming that the researcher 
met the prerequisite conditions, to specify that the researcher need not 
explain in detail what each of the risks of breach are, and need not 
obtain formal informed consent as a prerequisite to asking survey questions. 




Alternatively, the Commission may wish to make such an amendment appli- 
cable to all social science research, or all research funded by DHEW, 
not merely survey research. 

Whatever the resolution of the problem, there is a need for it to be 
addressed. Abundant feedback from the survey research community indi- 
cates that it is confused as to its responsibilities under Part 46, and 
that it is generally reacting to that confusion by ignoring the 
regulation. Whatever the treatment of the confidentiality problem in 
survey research is to be, there should be language specifically addressed 
to it in the definition of informed consent or, alternatively, in the 
definition of risk. 

11. Should Waiver of the Informed Consent Requirement be Permitted 
Under Exceptional Conditions in Social Science Research? 

The present regulation provides (§46.10(c), Documentation of Informed 
Consent) that there may be modification of the form of documentation 
that informed consent has been given by subjects in a particular research 
project. Reports from "H" staff supervising behavioral research, other 
DHEW agency staffs, and grantees and contractors indicate that this 
"modification" clause is frequently used to effect a waiver of some 
of the elements of informed consent. This has been done when it has 
appeared that a particular research project could not proceed if the 
whole informed consent procedure were to be implemented — if, for 
instance, all procedures employed in the research were explained to 
subjects whose behavioral responses were to be measured by the research. 

It is clear that the modification section needs tightening up to ensure 
that it cannot be used as an invisible justification for abdication of 
some elements of the informed consent requirement. 

Widespread use of the modification clause to avoid some of the substan- 
tive protections of the regulation, however, does suggest that there may 
be circumstances in which the Secretary, the funding agency, or an outside 
advisory board should be empowered, pursuant to strictly drawn criteria, 
to waive some of the elements of informed consent for particular research 
projects. For example, 

(i) What if, as in a housing allowance or a police patrol 
experiment, it is impossible to identify all of the 
nonparticipants at risk arising from the experiment? 
Alternatively, what if they can be identified only at 
prohibitive cost? 

(ii) What if they can be identified, but it is impossible to 
obtain consent at reasonable expense from a large non-subject 
population at risk, with whom the researcher would not ordi- 
narily establish contact in the course of the research? 

(iii) What if, as in unobtrusive measures research, there is a 
research design need to prevent individuals from knowing 
that the research is being conducted, in order to avoid 
skewing of otherwise natural behaviors which the researcher 
seeks to observe? 




In social science research in which such circumstances are present, 
and perhaps in other circumstances as well, we may want to empower 
the Secretary or another party to waive some of the elements of the 
informed consent requirement, provided that: 

(i) The waiver would apply only to nonparticipant persons at 
risk arising from the research in question, not to subjects 
who are identifiable ex ante and from whom data is collected. 
In a housing allowance experiment, for instance, waiver 
might be granted with respect to neighbors whose rents may 
be affected by the experiment, but not with respect to subjects 
who actually receive the allowance. 

(ii) Waiver would be granted only upon a showing that it is 
"demonstrably infeasible" to obtain informed consent from 
a specified nonparticipant population, on the ground that 
one of a number of narrowly specified triggering conditions 
exists. The regulation could specify that the expression 
"demonstrably infeasible" (or some analogue) be strictly 
construed, and that the criteria — the conditions precedent — 
be very strictly construed. It could be specified that 
the intent of the strict construction is that waiver be 
infrequently approved. 

(iii) Waiver would be granted only under the condition that the 
information withheld be given, where the persons at risk 
are identifiable, to the affected persons in a debriefing 
after the research procedure has been completed. 

(iv) Waiver would be granted only under the condition that the 
research investigator attest in writing that the risk to 
nonparticipants reasonably to be expected from the research 
is deemed insubstantial in probability and in magnitude. 

In the event of waiver, and if the nonparticipants at risk reside prin- 
cipally within a particular unit of local government or, alternatively, 
within a single state, perhaps the regulation should require surrogate 
consent by an official of the local or state government. This proxy 
for individual agreement would be intended to provide local control 
over the acceptability of risk to non-subject persons, and to maximize 
the willing participation of the community affected by the research. 

Given the realities of local government, the probability is that 
members of the community disinclined to have their local government 
consent to an experiment will be able to have their way, even though 
their numbers be few — simply because they will care much more about 
the research than those community members inclined to permit proxy 
consent to be given to a particular research project. Those who 
care most intensely about an issue are generally able, absent similar 
intensity of feeling on the other side of the issue, to prevail at the 
local government level. 

Several of the Brookings conferees indicated their enthusiasm 11/ for 
surrogate local government consent as a means for protection of 




non-subject populations at risk from research which affects an entire 
community, labor market, or commodity (such as housing, or hospital 
services) market. There were some caveats, however: 

"For large-scale social experiments ... it is unlikely that any 
group with a prior definition will ever be quite unanimous in 
its consent — or unanimous without what some commentators have 
been calling 'undue inducements.' (Or, as probably happens, 
a majority coerces the minority to shut up and sign up, or 
a minority coerces the majority to do so.)" 12/ 

"This extension of the consent principle /proxy consent by 
elected representatives of affected nonparticipants/ may not 
always have the intended effect. When representatives of the 
Department of Housing and Urban Development took their proposal 
for a housing allowance supply experiment before the city 
council of Green Bay, Wisconsin, and carefully explained that 
local house prices might increase as a result, the council's 
immediate response was eagerly to calculate the implicit rise 
in property tax revenues." 13/ 

Surrogate consent will be pernicious and arbitrary in its effect 
upon nonparticipant subjects when the interests of politicians 
making the consent decision diverge from the interests of the 
affected population, and when the researcher can offer inducements 
to the politicians to make a decision unrelated to their constituent 
interests. In Green Bay, for instance, it appears that the city 
council perceived a way to raise taxes without incurring the political 
costs to themselves, and weighed that personal interest above the inte- 
rests of their constituents at risk. 

If a surrogate consent provision should be added to the regulation, 
it should be specified that all of the elements of information which 
must be presented to a subject in order to gain individual informed 
consent must be presented in writing to the local or state official 
who executes the affidavit of surrogate consent. Although the proce- 
dure be different, and the giver of informed consent not the person 
affected, the substantive elements of consent should remain. 

It seems clear that, absent addition of a waiver provision to Part 46, 
either it will be impossible to perform some social science research 
within the constraints of the regulation, or the modification clause 
will continue to be used as an invisible, unregulated, unarticulated 
waiver. Addition of a waiver would, assuming the latter prognosis, 
actually increase the protections available to nonparticipants at risk 
in social science research funded by DHEW. 

Surrogate consent of some form will, assuming the availability of 
waiver or modification, maximize the protection available to 
non-participants with whom the researcher does not come into contact. 






12. Compensation of Subjects; Restoration of Status Quo Ante 14/ 

The regulation is silent on whether, and under what circumstances, the 
researcher or the Department has the responsibility to compensate a 
subject. For example, 

(i) Should the Department sponsor, and should each research 
investigator be required to pay premiums into, a no-fault 
insurance system which will compensate subjects for unfore- 
seen harm, the possibility of which was not mentioned to 
the subject by the investigator in the process of obtaining 
informed consent? 

(ii) If such a system is established, should subjects be compen- 
sated not only for unforeseen harm but also for improbable 
catastrophic harm, the possibility of which had been foreseen 
and explained by the investigator to the subject in the 
process of obtaining informed consent? 

(iii) Is there more of a duty to compensate catastrophically 
harmed nonparticipants from whom informed consent was 
never sought, in comparison to subjects who gave informed 
consent after having been warned of the small risk of 
foreseeable catastrophic harm? Assuming that the harm is 
not catastrophic, is there greater responsibility to 
compensate nonparticipants who never gave consent, in 
comparison to subjects? 

What is the operational meaning of that greater duty toward 
nonparticipants who did not give consent, if such duty exists? 

(iv) After a social science experiment is over, does the research 
investigator have the responsibility to insure the status 
quo ante — to ensure that subjects are left after the experiment 
no worse off than they would have been had they never partici- 
pated in it, or no worse off than they were when it began? 
Does the researcher have, for instance, the obligation to 
guarantee reinsurability for participants in a health insurance 
experiment who have allowed their pre-existing policy to lapse, 
or to ensure that subjects in a housing allowance experiment 
can obtain, if and when they are compelled to leave their 
experimentally subsidized housing at the end of an experiment, 
housing equivalent in quality and price to what they had before 
they began receiving the allowance? 

(v) Has the investigator a similar responsibility to restore 
the status quo ante for a subject who withdraws in the middle 
of a research project? 

(vi) Has the investigator a similar responsibility toward a non-subject 
in the experimental community who emerges harmed at an experiment's 
termination? 

These, and other compensation issues, warrant amendments to Part 46 and 
consideration by the Commission. 




Conclusion 

Based upon the conceptual framework of a biomedical research model, 
the current regulation on protection of human subjects is inappropriate, 
in a number of major respects, to effective regulation of social science 
research. The response to the regulation, both by non-"H" agencies of 
the Department and by private research investigators, indicates that it 
is either not being applied to social science research at all or, where 
applied, has the potential of skewing substantially the data collected 
by that research. 

The Commission ought closely to examine the current regulation in order 
to determine what amendments to it, if any, should be recommended in 
order to maximize the protection actually available to human subjects 
and to other persons at risk arising from social science research funded 
by the Department. 






Footnotes 

1. On risks arising from publication or policy application of research 
results, see also Alice M. Rivlin and P. Michael Timpane (editors), 
Ethical and Legal Issues of Social Experimentation, Washington, D.C.: 
The Brookings Institution, 1975 /hereinafter, "Brookings conference"/, 
pp. 73, 78, 81. 

2. Ibid., p. 52. 

3. Ibid., p. 78. 

4. Ibid., p. 73. 

5. Ibid., p. 65. 

6. Ibid., p. 107. 

7. Ibid., p. 114. 

8. Ibid., pp. 69-70. 

9. Lloyd B. Lueptow (Summarized by Keith Baker), Bias and Non-Response 
Resulting from Informed Consent Procedures in Survey Research on 
High School Seniors, unpublished, DHEW: Office of the Assistant Secretary 
for Planning and Evaluation, January 1976, pp. 43, 6. 

10. Donald Campbell et al., Protection of the Rights and Interests of Human 
Subjects in Program Evaluation, Social Indicators, Social Experimentation, 
and Statistical Analyses Based Upon Administrative Records, Preliminary 
Sketch, January 1976, p. 13. 

11. Brookings conference, pp. 77, 95, 110, 125, 171-72. 

12. Ibid., p. 172. 

13. Ibid., p. 110n4. 

14. On compensation of subjects and on restoration of status quo ante, see 
ibid., pp. 11, 18, 54, 57, 63, 70, 76, 77, 103, 174. 






IV 
RISK/BENEFIT CRITERIA 



19 

SOME PERSPECTIVES ON THE ROLE OF ASSESSMENT OF RISK/BENEFIT 

CRITERIA IN THE DETERMINATION OF THE APPROPRIATENESS 

OF RESEARCH INVOLVING HUMAN SUBJECTS 

Bernard Barber, Ph.D. 
December 1975 



INTRODUCTION 

The draft of a recently compiled Annotated Bibliography on 
the Protection of Human Subjects in Social Science Research 
(Washington, D.C.: Bureau of Social Science Research, 1975, mimeo.) 
speaks of "the scarcity of material which is explicitly concerned 
with the assessment of risk for subjects involved in social science 
research." This scarcity or lack has now, fortunately, been consider- 
ably corrected by Dr. Robert Levine's staff paper for the Commission, 
"The role of assessment of risk-benefit criteria in the determination 
of the appropriateness of research involving human subjects," (mimeo., 
Oct. 27, 1975). Since Dr. Levine did not limit his discussion to 
biomedical research but referred to behavioral research as well, and 
since I find his analysis altogether excellent in its cogency, its de- 
tail, its comprehensiveness, and its examples, I can be most useful by 
directly orienting my paper to his. In the first part of my paper, as 
I take up some general issues in the assessment of the risk-benefit 
ratio in behavioral research, I will be trying to add to, refine, 
extend, set in perspective, and evaluate Dr. Levine's discussion. In 
the second part of my paper I will present some findings from a small 
study I have done of the actual experience during the last three years 
of the Columbia University Human Subjects Review Committee, the peer 
review committee responsible for all the non-medical research carried 
out by the Columbia faculty. I will also present a few other available 
data on actual experience in peer review groups with the risk-benefit 
issue. Finally, in the third and last part of my paper, I would like to 

say something about ongoing and needed research on the risk-benefit issue. 




Too much of the discussion of the ethical problems of using human 
subjects in research proceeds in terms of ethical abstractions not 
clearly related to the empirical data they are supposed to clarify 
in order for us to make ethical decisions. I find the ethical ab- 
stractions of values not all that hard to come by; they are not 
esoteric; they are usually available even to informed common sense. 
But the facts to which they refer, those that make it possible to 
estimate the weight of the several ethical abstractions and to bal- 
ance off these values one against another in the process of ethical 
decision, those are often not available in any systematic and reliable 
form, nor are they easy to collect. That is why we need so much re- 
search for all aspects of the Commission's deliberations. For ex- 
ample, there is no lack of ethical abstractions for the discussion of 
fetal research or psychosurgery, to take two issues on which the 
Commission is specifically charged with responsibility. What has been 
lacking are reliable data on which to base established ethical principles 
for these two areas. The Commission has now supported useful research 
in both of them. More research is also essential to the Commission 
for its deliberations on the ethics of behavioral research on human sub- 
jects. 

SOME GENERAL ISSUES IN THE ASSESSMENT OF THE RISK-BENEFIT 
RATIO IN BEHAVIORAL RESEARCH 

1. Is the assessment of the risk-benefit ratio in behavioral re- 
search fundamentally different from or similar to such assess- 
ment in biomedical research? 




During the last few years, as behavioral researchers have become 
aware that their work was to be subject to ethical peer review in the 
same way as that of their biomedical colleagues, they have responded 
with much of the same uneasiness, hostility, and conservatism earlier 
displayed by these biomedical colleagues. (See Bernard Barber, 
"Liberalism Stops at the Laboratory Door," 1975, mimeo., and Barber, 
"Social Control of the Powerful Professions," 1975, mimeo.) As a part 
of their complaint against the imposition of ethical peer review on 
behavioral research by the D.H.E.W. regulations in 1971, they have 
said that their work should not be covered by "the medical model" that 
they allege is implicit in the D.H.E.W. regulations. Just how their 
work with human subjects, and just how the problems of ethical control 
in their area, are different from "the medical model," they do not make 
quite clear. Yet they are raising an important question. How different 
are the ethical problems of behavioral and biomedical research? Is the 
assessment of the risk-benefit ratio different in behavioral and bio- 
medical research? 

Perhaps just because Dr. Levine did not in his paper set himself the
task of answering this question directly, indeed he did not take it as
in any way his task, I find the answer that is implicit all the way
through the paper all the more convincing. That answer, it seemed to me
as I read and re-read Dr. Levine's paper, is that the similarities are
far, far greater than the differences between biomedical and behavioral
research in respect of the problem of assessing risk-benefit ratios. I
found large and fundamental similarities in Dr. Levine's discussion with
regard to such matters as: (a) the basic meanings of what are injuries,
what are benefits; (b) the specification of the significant dimensions
of risks (likelihood, severity, duration, reversibility, early
detection, ability to treat or correct) and benefits (Dr. Levine
himself says, p. 38, "The benefits may be analyzed similarly whether
the research is in the biomedical or in the behavioral field.");
(c) his classification of categories of risks and benefits
(physical, psychological, individual and social, legal, and economic);
(d) his list of some of the specific psychological harms that may
occur from biomedical research (fear of rejection, guilt and self-blame,
distrust); (e) the nature of the task of assessment of the balance
of risks and benefits; and (f) the question of where authority and
control in the assessment process ought to exist. As is indicated in
the sentence on p. 38 of Dr. Levine's paper quoted above, occasionally
even he makes the fact of similarity quite explicit. It is also clear
from his frequent references to examples and consequences of behavioral
research; the implicit assumption of these references is the similarity
to biomedical research. This similarity is also manifest in "A Checklist
of Ethical Issues in Social Experiments" prepared after a recent
two-day Brookings Institution Conference on Ethical and Legal Issues of
Social Experimentation, a Conference in which social and medical experi-
ments were explicitly compared with one another. The Checklist's section
on the question, "Have you specified and reviewed the benefits and harms
of your experiment?" is no different from what such a section would look
like for biomedical research. (Alice M. Rivlin and P. Michael Timpane,
eds., Ethical and Legal Issues of Social Experimentation, Washington,
D.C.: Brookings, 1975. See also, Henry W. Riecken and Robert F. Boruch,
eds., Social Experimentation: A Method for Planning and Evaluating Social
Intervention, New York: Academic Press, 1974, pp. 246-8, 252-3.)




Fundamental similarity is not, of course, identity. While the 
principles and procedures of risk-benefit assessment in both fields 
are fundamentally the same, we should be alert to and responsible for 
such differences as also occur. But I think we get a better start on 
understanding risk-benefit assessment in behavioral research if we 
start with the fact of similarity and the useful immediate guide that 
gives us to good practice with respect to behavioral research. Indeed, 
there is no good evidence that peer review groups considering behavioral 
research protocols have not been able to operate with the standard 
D.H.E.W. regulations for all research. It seems to me that the burden 
of an argument for difference, whether it is a general argument or 
applies only to specific points, lies with those behavioral researchers 
who choose to assert it. 

On one important matter of similarity or difference between be- 
havioral and biomedical research, the relative overall amounts of 
riskiness or injury, on the one hand, and of benefits, on the other, 
I am not now committing myself. This is a more complex aspect of the 
problem of similarity and difference, hard to discuss in the absence 
of data, not to be left to mere opinion or prejudice. I will come back 
to this issue later after considering a little further what we mean by 
risk and injury. 

My summary view, then, is that there is very large similarity and
overlap in all fundamental aspects of the problem of assessment of risk- 
benefit ratios in biomedical and behavioral research. Some consequences 
follow from this similarity which it is useful to point out. One 
consequence is that despite the fact that research institutions often 
have different ethical peer review committees for behavioral and biomedical
research because of the different technical substance involved in these
two kinds of research, these different committees,
because of the great similarity of their tasks, ought to have much 
more communication and cooperation with one another than they now
do. Another consequence is that it would be very helpful all around 
to include more behavioral researchers in the process of establishing 
general principles and rules for ethical treatment of the human subjects 
of research. We all owe a great debt to N.I.H. and D.H.E.W., where 
this process has been chiefly located, but it has perhaps been too 
largely in the hands of biomedical researchers. Behavioral scientists 
would be especially valuable for their awareness and insistence on the 
necessity for research-based data and decisions in all aspects of
the ethics of experimental research on human subjects. 

2. The dimensions of "risk": amount and probability of injury. 

Before proceeding further, it will be helpful to discuss a small but 
important definitional point made by Dr. Levine at the very beginning 
of his paper. He is quite right, I think, in feeling uncomfortable with 
the ambiguity in the meaning of the term "risk" as it is now used in a 
taken-for-granted way in all discourse on risk-benefit ratios. He is 
right that this now-standard usage actually implies two different elements:
one, the amount of injury or harm that may be done to research subjects;
two, the probability that the estimated amount will occur. If we make
these two elements explicit, by calling one "amount of injury" and the 
other "probability," we gain a number of advantages. First, we make it 
explicit that we are talking about injury, a very concrete term and one 
that leads on quite directly to making very specific statements about the
nature of that injury. Second, we see more easily that there can be a
varying relationship between injury and probability. Small injuries
may be very probable and large injuries may be most improbable. Such 
various combinations are important for the decisions made by peer 
review groups. Indeed, the researcher may want to provide, and the 
peer review groups may want to require him to furnish, a set of 
possible injurious outcomes of research, consisting of different 
combinations of amount and likelihood of injury under different con- 
ditions. Anything we can do to be clear and specific will make our 
peer review group decisions easier and better. This new usage, in 
which amount of injury and its probability of occurrence are both 
specified so far as possible, would be equally applicable and equally
valuable in both biomedical and behavioral research. 
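To make the decomposition concrete, here is a minimal illustrative sketch (my own, not Dr. Levine's, and using an invented ordinal severity scale) of how a researcher might furnish a peer review group with a set of possible injurious outcomes, each stated as an amount-probability pair:

```python
# Illustrative sketch only: an invented ordinal scale for "amount of injury,"
# paired with an estimated probability of occurrence, as the text suggests
# a researcher might furnish to a peer review group.

SEVERITY = {"none": 0, "minor": 1, "moderate": 2, "severe": 3}

def expected_injury(outcomes):
    """Severity-weighted sum over possible injurious outcomes,
    each given as a (severity label, probability) pair."""
    return sum(SEVERITY[label] * p for label, p in outcomes)

# A small injury may be very probable; a large injury most improbable:
protocol = [("minor", 0.30), ("moderate", 0.05), ("severe", 0.001)]
print(expected_injury(protocol))  # severity-weighted expectation
```

The point of the sketch is only that once amount and probability are stated separately, their varying combinations can be compared explicitly rather than lumped into a single undifferentiated "risk."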

3. The "biological person" and the "social person". 

We may clarify the underlying issues and see the fundamental 
similarity in the outcomes of biomedical and behavioral research a 
little more clearly still if we adopt for present purposes a somewhat 
different classification of the types of injuries ("risks") and 
benefits than the one Dr. Levine has taken over from common usage 
(physical, social, psychological, etc.) We may speak of injuries 
or benefits being done to or occurring to either the "biological 
person" or to the "social person". It has been the pattern in 
discussion of the ethics of the use of human subjects to say that there 
are two essential issues, "the risk-benefit ratio" issue and 
"informed voluntary consent" issue. The implication of this way of 
speaking is that the first issue, risks and benefits, concerns 
only the "biological person" and that the second issue, informed 



19-7 



consent, concerns the "social person". But we can now see from Dr. 
Levine's discussion and examples that the injuries and benefits of 
biomedical research, as much as of behavioral research, involve the 
"social person" as well as the "biological person". Biomedical research 
can injure the "social person" by causing him to feel guilt or 
anxiety or a sense of being discriminated against; it can injure the 
social body by creating distrust between physician-investigators and 
their patient-subjects or by discriminating in its selection of research 
subjects, as by using the poor disproportionately often, for all types 
of experiments and, even worse, disproportionately often for those 
experiments where the risk-benefit ratio is unfavorable. (For the 
data on these two patterns of discrimination, see Bernard Barber, 
John Lally, Julia Makarushka, and Daniel Sullivan, Research on Human 
Subjects, New York: Russell Sage, 1973.) For biomedical research, then,
it is not just "informed consent" that applies to the "social person" 
but all aspects of that research. 

Indeed, perhaps we can see this point more vividly by noting 
that to a considerable extent it is the social and cultural definitions 
of a society that determine what is to be considered an injury even 
to the "biological person". As we consider the difficult questions of 
what is to be considered an injury to the fetus or the terminally
ill as "biological persons," we see how much the "biological person" 
is socially defined. Even for biomedical research, then, the 
"biological person" and the "social person" overlap and blend into one 
another. 

For behavioral research, of course, the overlap and merging are
clearer. For one thing, there is less possibility, obviously, of harm
to the "biological person," though Dr. Levine gives some examples from
psychiatry where this occurs. The primary concern in behavioral
research is with injury to the "social person". In this
perspective, there is no difference for behavioral research between 
the "risk-benefit ratio" issue and the "informed consent" issue. 
All injuries on either ground tend to be to the "social person". 
Violations of informed consent regulations are as much injuries to 
the "social person" as are injuries to personal esteem or reputation. 
"Deception" in psychological experiments or in social research (through 
the use of "unobtrusive measures") are injuries even though a peer 
review group may mark them down as a violation of the rules of informed 
consent. 

In sum, when we consider the degree of relativism in our defi- 
nitions of the "biological person" and the "social person," when 
we see how they overlap and interact with one another, we are impressed 
with the great similarity of the possible injurious outcomes of 
biomedical and behavioral research. In making ethical decisions 
about the use of human subjects in any kind of research, ultimately 
what we are interested in is the moral status of the "social person". 

4. The fact and necessity of risk-benefit assessment. 

Absolutistic and perfectionist thinking about risk benefit 
assessment procedures, which occurs among some biomedical and behavioral 
researchers, and not least of all among those of them who are opposed 
to such procedures, is the great enemy of realistic and continuing 
attempts to achieve improvement in these matters. We should be 
impatient with absolutistic and perfectionist thinking which expresses 
itself in such declarations as, "You can't truly get informed consent," 
or "You can't really make a risk-benefit ratio assessment". Realistic 



19-9 



thinking on risk-benefit ratio assessments, whether in biomedical or 
behavioral research, proceeds on the premise that a considerable 
amount of such assessment will in fact be easy, another and smaller 
amount may be difficult but still possible, and that only a very small 
amount will be so difficult as to be considered "impossible". The
practical and moral necessity for such assessment is obviously there, 
and the fact of relatively successful performance is also clear. Just 
as we make rough but approximately satisfactory risk-benefit assess- 
ments in the thousand-and-one routine and extraordinary activities 
of daily life, so we now have a considerable experience with the 
fact that biomedical and behavioral research peer review groups are 
making risk-benefit assessments on the same terms and in a routine way. 
For the majority of such assessments, the easy ones, there is no great 
moral or cognitive strain on the peer assessors. But for the more 
difficult ones, as we learned by mail questionnaire and personal 
interview from the six hundred or so biomedical researchers who par- 
ticipated in our two studies and made such risk-benefit assessments 
for us of "hypothetical but real" research protocols, there is some 
strain. (See Barber, et al., op. cit.) Nonetheless, in a sufficient
number of cases, not only among our study respondents, but in actual 
peer review groups, scientific peers do overcome this strain and make 
conclusive assessments. 

We should remember that, even where difficulty and strain occur 
in the assessment process, they are worthwhile just because the process 
of making assessments has value over and beyond the outcome or product 
of the process. Whether routine or difficult and causing strain, the
process of explicitly estimating injuries (amount and probability)
and benefits (again, amount and probability) is important in itself. 
The process is in itself "consciousness-raising"; it leads to higher
ethical awareness. One hopes that the product , now or eventually, will 
also be better, but that happy condition we should not expect ourselves 
to guarantee. We should not expect, and certainly not require perfec- 
tion of risk-benefit assessment in all cases from our biomedical or 
behavioral review groups. Perfection in all cases is for Utopias 
and heavenly worlds. Moreover, we should inform the general public 
that we do not guarantee perfection of assessment product but only 
excellence in the process . We should inform them that we are prepared 
to defend scientific research assessment, when it is conscientiously 
and competently carried out by professional peer review groups, 
against all demands for Utopian perfection of product . 

Another way in which unrealistic expectations for risk-benefit 
assessment products express themselves is in the call for quantitative 
and complexly mathematical formulations and specifications of the risk- 
benefit balance in any particular piece of biomedical or behavioral 
research. I do not think that the use of the terms "outweigh" and 
"sum" in the D.H.E.W. regulations about the risk-benefit ratio is 
intended to be anything more than metaphorical. In everyday 
language, certainly, we use such terms in full understanding of their 
metaphorical character. The D.H.E.W. regulations enjoin the peer review 
groups only to be prudential, to do the best they can as they think 
about "balances," "sums," and "weights". We have too much the 
fearful tendency to think that D.H.E.W. is expecting more of us than 
we can produce, that we must search for hidden meanings and covert 

19-11 



expectations in its necessarily vague and metaphorical language. 
There are, of course, those who have developed elaborate and formal 
mathematical equations for a variety of social processes, systems, 
and "cost-benefit" ratios; they would love to have peer review 
groups try out their exercises. But I think Dr. Levine is correct 
when he says (p. 48) that "At this point it seems appropriate to 
avoid using mathematical models to calculate risk-benefit ratios...". 
Rougher and simpler modes of "measurement," what the sociologist 
Paul F. Lazarsfeld has called "qualitative measurement," are more than
adequate for a satisfactory ethical process right now in making 
risk-benefit assessments. Wherever more quantitative measurement or 
even mathematical modeling are possible, of course, they should be 
encouraged, as in some epidemiological studies of injuries and 
benefits from biomedical or behavioral research. But most risk- 
benefit assessment processes cannot be fully quantitative just yet. 

I should like to comment on another aspect of these assessment
processes: the answer (pp. 53ff.) that Dr. Levine has given in his
memorandum to his question, "Who has the authority or responsibility
to assess risk-benefit criteria in the determination of the
appropriateness of research?" I agree with Dr. Levine that
a "central role" should be assigned to the IRBs, though, as I have 
argued elsewhere (Barber, et al., op. cit., Ch. 11), it is important
that IRBs should include lay outsiders in all cases and, in some, 
also medical specialist outsiders from other research institutions. I 
further agree with Dr. Levine that the subjects themselves have an 
important part to play in prudential assessment processes. Finally, I 
think he has well described the role that national review boards could
play in especially difficult assessment decisions and also in improving
and making easier the work of the local committees. 

Nevertheless, because the social interaction processes involved 
both in risk-benefit ratio assessments and in informed consent 
procedures are quite complex, including a number of different significant
social actors and an extended time period not covered by the authority 
and control mechanisms Dr. Levine has discussed, I would like to 
see other authority and social control mechanisms included. The 
ethical education of the physician, either in medical school or 
thereafter, is not yet satisfactory. A more satisfactory education 
ought to be an important additional support for satisfactory and 
authoritative risk-benefit assessments. So too ought strengthened 
and more self-conscious informal peer control mechanisms, such as 
informal conversations, consultations, advice and even interventions. 
Finally, I would like to see some better education for potential 
research subjects, who are all of us, in both their rights and obli- 
gations as research subjects. Here, as in other social realms, 
there is probably a useful role for a variety of responsible "consumer" 
protection agencies, especially with regard to the protection of 
the particularly vulnerable social categories such as children, 
prisoners, and the mentally ill, where their own resources are not 
sufficient to participate prudentially in the process of being
research subjects. Certainly, for the improvement of risk-benefit 
assessment processes we need to pay a good deal of attention to who 
is involved in those processes, and how, that is, what knowledge of 
and control over the processes they actually have. Research on these 
matters will be valuable. 






5. Is behavioral research less injurious than biomedical 
research? 

One way of summing up our comparative perspective on risk-benefit 
assessment in biomedical and behavioral research is to ask ourselves, 
Is behavioral research less injurious than biomedical research? That 
is, overall is there a better risk-benefit balance for biomedical than 
for behavioral research? The quick and all-too-current answer, of
course, is yes. Since biomedical research is more often than 
behavioral research a life-and-death matter, both causing grievous 
injury sometimes but more often bringing life itself, it is on the 
whole productive of a better risk-benefit ratio. 

And yet, before we think there is a great dissimilarity, we
need to look at a few qualifying facts. First, it is important to 
remember that there is very little life-and-death research even in 
biomedicine. The study my colleagues and I did of some 300 biomedical 
researchers using human subjects, who reported to us on 424
different research projects in which they were involved, showed that
most research is both scientifically and ethically trivial, far from 
being a life-and-death matter. Taking note of this fact, and 
remembering also that we do not have even a rough calculation of the 
total amount of harm and good that either biomedical or social 
research has given us, we may well express a little hesitation about 
making firm and precise comparisons of the two. Finally, we have 
to remember that many people consider some social injuries and benefits 
even more important than biological health and life itself. Orwellian,
1984-ish nightmares about social slavery and total thought control
can seem more real and more horrible to people than harm, or even
death, done to the "biological person". Insofar as it is consequential
for such fundamental injuries and benefits, behavioral
research is obviously of great importance to us. The assessment of 
its risk-benefit ratios is not a small or indifferent matter. 

In sum, while we may agree that, overall, probably more hangs 
in the balance from biomedical research, still a great deal of the 
greatest importance is involved in the injuries and benefits of 
behavioral research. The assessment of the balances of these injuries 
and benefits is of the first importance for the ethics of scientific 
research using human subjects. If it is not the most important 
problem we have in this field, neither can it by any moral or 
prudential standard be called unimportant. 

SOME EMPIRICAL DATA ON RISK-BENEFIT ASSESSMENT IN BEHAVIORAL RESEARCH 

I have been arguing for the fundamental similarity, in principles 
and procedures, of risk-benefit assessment in biomedical and behavioral 
research. Arguments, of course, and especially one so important 
for policy as this one, should be supported by facts, and preferably 
systematically collected and reliable facts. What are the facts 
regarding actual processes of risk-benefit assessment in biomedical 
and behavioral research? 

Unfortunately, but just as is the case in nearly all areas 
of the ethics of research using human subjects, good data are hard to 
find. What we have instead are mostly scattered, unsystematic
statements, as well as a few more systematic facts; I present them here
not so much because they prove the case for or against the argument of
fundamental similarity, but simply as a basis and a background for the
better data that ought to be built on them.

For eventual comparative purposes, though the data for precise 
comparison from behavioral research are not now available, we may 
start with some data on risk-benefit assessments in biomedical research. 
First, as to amount of risk (probability of risk occurring was not asked):
When we asked some 300 biomedical researchers at University Hospital 
and Research Center to estimate the amount of risk involved in 422 
different studies they were doing on human subjects, they said that 
1% (4 studies) involved "high risk," 2% involved "moderate" risk, 
8% involved "some" risk, 45% involved "very little" risk, and 44% 
involved "no risk at all". (Barber, et al . , op. cit ., pp. 39,45.) 
Second, when we asked these researchers also to estimate amount 
of benefit for subjects, and amount of possible benefit for future 
patients, we were able to establish risk-benefit ratios. We dis- 
covered that in 18% of the studies, risk outweighed benefit to subjects;
these we called the "less favorable" studies. We also discovered 
that some of these 18%, amounting to 8% of the total of 422 studies, 
were what we called the "least favorable" studies because the risk- 
benefit ratio was unfavorable even when we added benefit to future
patients to benefit to present subjects. (Ibid., pp. 47, 50.)
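The two-step classification just described can be sketched schematically as follows. This is my own reconstruction, not the study's actual coding; the numeric ratings are hypothetical stand-ins for the ordinal estimates the researchers supplied:

```python
# Hypothetical sketch of the two-step risk-benefit classification described
# in the text, assuming comparable ordinal ratings (e.g., 0-4) for risk and
# for each kind of benefit. Not the study's actual coding scheme.

def classify(risk, benefit_subjects, benefit_future_patients):
    """Return a study's standing under the two successive tests:
    first against benefit to present subjects alone, then with
    benefit to future patients added in."""
    if risk <= benefit_subjects:
        return "favorable"
    if risk <= benefit_subjects + benefit_future_patients:
        return "less favorable"   # risk outweighs benefit to subjects alone
    return "least favorable"      # unfavorable even with future benefit added

print(classify(1, 3, 0))  # favorable
print(classify(3, 1, 3))  # less favorable
print(classify(3, 1, 1))  # least favorable
```

Note that in the study's usage the "least favorable" 8% is a subset of the "less favorable" 18%; the sketch returns disjoint labels only for clarity.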

For present comparison with these data, all we have is some 
unsystematic data from three institutions, University of California, 
San Diego; University of California, Berkeley; and Columbia University. 
The San Diego data were presented by Professor George Mandler at a 
recent symposium on the ethics of research on human subjects at an 
annual meeting of the American Psychological Association. (Behavior 
Today, Sept. 29, 1975, 573-574.) Mandler reports that his university
has separate committees for behavioral and biomedical research; this
practice is followed by many major universities, though we do not 
know how many. He further reports that at San Diego, whereas "about 
90% of the research projects generated by the medical school involve 
some risk," only 25% of the research submitted to the behavioral 
peer review group involves some risk. It should be noted that the 
90% "some risk" figure for the San Diego biomedical group compares 
with a figure of 56% in the university hospital and research center 
studied by me and my colleagues. 

In his "DHEW Regulations Governing the Protection of Human 
Subjects and Non-DHEW Research: A Berkeley View," (mimeo., 1975) 
Professor Herbert P. Phillips, the Chairman of the Committee for 
Protection of Human Subjects, a committee which covers only behavioral 
research, reports that "the vast number of projects currently examined 
by the CPHS involve 'no risk' or extremely low risk to the human
subjects". What does "vast number" mean? Professor Phillips 
continues: "Under the present review system, no more than 10-15 
out of every 100 cases that we examine require a modification of 
research design to better protect the human subjects; and in the 
vast majority of these 10-15 cases the 'risks' to the subjects are 
so self-evident that the cases would have to come to the CPHS's 
attention, whatever the system of review." Professor Phillips 
concludes with a statement that is representative of the views of 
those behavioral researchers who would like to alter the present 
procedures of risk-benefit assessment for their field: "It just does 
not seem reasonable to have 85-90% of Berkeley researchers, and members
of the CPHS, waste so much of their valuable time and energy on
lengthy, but essentially meaningless expositions proving that 
no harm will come to their subjects, or, conversely, that they are 
morally upstanding scholars. There is an element in this process 
that is clearly reminiscent of the California 'Loyalty Oath'". 

Being very much aware of the lack of empirical studies of risk- 
benefit assessment in behavioral research, when I was asked by the 
Commission to prepare the present paper I decided to do a small study 
on the experience in this field of the Human Subjects Review Committee
at Columbia University. I have been a member of this Committee for 
the past three years and am now its Chairman. It is on this 
experience that I am reporting. Unfortunately, we do not ask our 
member-reviewers to do more than indicate whether there is "some risk" 
or "no risk," so my data are not finely graded either as to amount or 
probability of risks. 

I should also report that I set down my pre-research impressions as
to what my findings might be. The fact that these impressions proved 
wrong turned out to be instructive. My pre-research impression was 
that there would be relatively few expressions of concern about harm 
or injury from our reviewers; I felt that informed consent would be 
the primary issue of concern. The data showed me wrong and I realized 
that I had been thinking of injury as only to the "biological person". 
When injury of various kinds to the "social person" is assessed as well,
the risk-benefit issue turns out to have been of greater importance
to the members of our Committee than informed consent shortcomings. 
It was this finding that led me to see the usefulness of the distinction 
between the "social" and the "biological" persons that I have presented 

19-18 



at the beginning of this paper. 

What are my actual findings? During the period from September, 
1972, to August, 1975, the Columbia University Human Subjects Review 
Committee screened 90 behavioral research proposals that passed 
through the University's Office of Projects and Grants. The members 
of the Committee include community members, university research staff,
and faculty members from anthropology, law, business school, 
social work, sociology, and psychology. Since it is our procedure 
to have three members review each proposal, there should have been 
270 reviews for the 90 proposals. We could find only 249 in the files 
and we report on these. Of these 249, just about half, 123 (49%) 
were unqualified approvals with regard to both issues, risk-benefit and 
informed consent. 48 of the reviews (19%) raised questions about 
informed consent. 79 of them (32%) had questions about what we 
called "risk", and 49 (19%) had questions about what we called 
"confidentiality". (It should be noted that questions by reviewers, 
where there were any, could add up to more than the remaining 51% because
a reviewer could mention more than one issue in the same review.) Thus, one in three
of the individual reviews raised explicit questions about what our 
standard check-list calls "risk". But though an additional 49 (19%)
of the mentions were about what the check-list calls "confidentiality,"
it is clear from the comments of the reviewers who mentioned this
that they were thinking of injury to the "social person" just as
much as when they mentioned "risk". For them, violations of
confidentiality were just as much injuries as the "risk" of damaging
the individual's self-esteem or his social reputation. "Risk"
factors seem to be those that directly cause such harm as embarrassment
or loss of reputation. "Confidentiality" still involves potential
injuries, though the harm it causes in the form of embarrassment is indirect,
a result of making the individual's identity visible and thereby 
exposing him to harm. Altogether, then, injury to the "social person" 
is thought by our reviewers to occur more often than just the 
explicit mentions of "risk" would suggest. These data from the 
H.S.R.C. experience indicate that some amount of risk is a not 
infrequent occurrence in behavioral research. 
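The tally reported above can be reproduced in a few lines (a sketch; the counts are taken directly from the text, the category labels follow the Committee's check-list, and the text rounds to whole percents):

```python
# Review counts from the Columbia H.S.R.C. files, September 1972
# through August 1975, as reported in the text.
reviews_total = 249  # found in the files, of the 270 expected (3 x 90)

mentions = {
    "unqualified approval": 123,
    "informed consent": 48,
    "risk": 79,
    "confidentiality": 49,
}

# One review can mention several issues, so these percentages
# can sum to more than 100%.
for category, count in mentions.items():
    share = 100 * count / reviews_total
    print(f"{category}: {count} ({share:.1f}%)")
```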

What are some of the injuries our reviewers mentioned? The 
list contains no surprises: embarrassment, loss of privacy, 
disclosure of confidential information, danger of arrest, adverse 
effects on family or larger social network relationships, anxiety, 
fear, self-incrimination, and harmful new self-awareness. Each of 
these general categories includes a number of different specific 
cases; it is the useful function of the Committee reviewers to 
discern the general harm in the variety of specific concrete cases. It 
would probably be a helpful guide to researchers if a list of such 
general categories of potential harm could be published, including 
several recurrent and representative cases for each category. 

SOME CURRENT AND NEEDED RESEARCH 

I should like to end this paper as I began it, with an emphasis 
on the need for more and better research on the problem of risk-benefit 
assessment in behavioral research. Such research should be explicitly 
comparative with research on biomedical risk-benefit assessment. It 
should also be systematic and cumulative, with each piece of work
building and improving on research that has gone before. Research
should test all our assumptions and seek to make the process of 
risk-benefit assessment both more effective and more efficient. 
For we want assessment principles and procedures that will do their 
job well and will be least costly of researchers' and other parti- 
cipants' time. 

Fortunately, we have a few studies underway that will add to our 
knowledge and serve as valuable models beyond the very few that now 
exist. One of these current studies is the Commission's own study of 
IRB's. The other is the N.I.M.H. funded study by Drs. Glen D. Mellinger 
and Mitchell Baiter, "Public Judgments Regarding Ethical Issues in 
Research." 

We should remember that the improper use of human subjects in 
research has only recently become widely defined as a social problem. 
The task of ameliorating this social problem is not simple and is bound 
to take time. Effective remedies will require no small amount of 
social change in several social circles. Expert, and expensive, 
social research of the kind represented by the Commission's study of 
IRB's and the Mellinger-Balter study of public views has an essential 
part to play in making this social change possible. 






20 



THE ROLE OF RISK/BENEFIT ANALYSIS IN THE CONDUCT
OF PSYCHOLOGICAL RESEARCH



Gregory A. Kimble, Ph.D.
University of Colorado

Concern about the ethics of psychological research is a fairly recent
development, and the reasons for its emergence are of
some interest. As long as the psychological investigator confined himself
to rats learning mazes, to college students mastering lists of nonsense 
syllables, to his own colleagues' psychophysical judgments of stimulus 
intensities in the interest of constructing scales of sensory magnitudes 
and to studies of eyelid conditioning aimed at uncovering the basic "laws"
for a behavior system, there were no serious problems. About the most
serious moral accusation anyone could make about such research was that 
it was pretty much of a bore for the subjects who participated. It was 
when the science began to study obedience to authority, racial differences 
in cognitive abilities, the behavior of homosexuals in public places, the 
decision-making activities of the members of juries and the personal 
characteristics of people hired for and fired from governmental positions 
that serious questions began to arise. The general point to draw from 
this comparison is that the more important the topic of investigation, the 
more sensitive are the ethical issues it raises. Research on the more 
recent topics invades the privacy of the individual in important ways. 
In some cases disclosure of the information obtained might put the person 
in danger of losing his reputation or of being arrested and jailed. 

Looking to the future it is clear that we will have to continue to
face these ethical issues for two reasons: first, psychology has developed
methods that allow the effective investigation of important social and
personal issues; second, the situation in the world demands such
investigation, whatever the consequences produced directly by research.

As my grandmother would have put it, "The world is going to Hell in 
a handbasket," or, as the more polite would say, "The quality of life is
deteriorating." The earth is overpopulated and there are places where 
people starve to death. Our supply of fossil fuel is about exhausted. 
Alternative sources of energy are not being developed rapidly enough and 
people have not changed their behavior in ways that would conserve what 
we have. The concern of people for the welfare of each other has reached
a new low; the well-documented refusals of people to come to the aid of
others in trouble are the obvious reference on this point. In last
night's paper there was the story about a gang of teenage hoodlums who 
boarded a bus in San Francisco, beat up and robbed passengers and tried to 
repeat the performance on another bus before the police stopped (but did 
not arrest) them. Over fifteen percent of the population have failed to 
develop the intellectual skills required to deal effectively with a 
newspaper ad for groceries. The list could go on: "pollution," "divorce," 
"child abuse," "the alienation of youth," "the frustrations of middle 
life" and "the indignity of old age" suggest just some of the problems 
that could be presented in more detail, but that is not my purpose here.

My immediate purpose is to direct your attention to the fact that 
most of the problems I have hinted at above are psychological problems. 
Physical technology will not provide the required solutions which must 
come from knowledge about behavior. Unfortunately the knowledge does not 
exist and only research will provide it. Research, however, puts the
research participant at risk. This last point poses the question to be
discussed: Are these risks appropriate given the benefits research provides?

Risk/Benefit Analysis 

Consider, to make the point concretely, the research of M. M. Berkun 
and his colleagues (1962) on stress in simulated wartime situations. In 
one experimental condition a group of army recruits were passengers aboard 
an apparently stricken plane that had to crash land. In other conditions 
recruits were subjected to a reported threat of accidental nuclear 
radiation, to the telephoned information that a forest fire had surrounded 
their outpost, and to a fictitious report that they were being subjected to
artillery fire by members of their own army. In all of these situations 
the realism of the crisis was enhanced by the use of noise, darkness, 
rugged terrain, smoke or whatever was required. In each of these situations 
the recruits' radio transmitters, the most likely instrument for securing 
help, "failed" and the behavior of interest was the recruits' effectiveness 
in trying to repair it. Under the stress of the situation some of the men 
left in cowardly retreat and many showed other signs of severe distress. 
Was it all worth it? 

The standard answer to this question these days takes the form of a 
"risk/benefit" analysis. The risks borne by the subjects in the experiment
(the experience of terror, being deceived, living with the knowledge of 
cowardly behavior) are to be weighed against the benefits provided by the
study (development of psychological screening measures, better knowledge 
about the effects of stress, possibly a more effective army). If the 
aggregate of benefits outweighs the risks the experimental procedure is 
justified; otherwise it is not. 




In the rest of this essay I shall subject the risk/benefit analysis 
to its own analysis. I shall show, I think, two things: (1) that in any 
respectable mathematical sense of the concept risk/benefit analysis is a 
practical impossibility in this context but (2) that it brings certain 
considerations into focus in ways that contribute to the decision as to 
whether a particular piece of research deserves to be carried out. 

The Dimensions of Complexity 

In the abstract the idea of subjecting research plans to a risk/ 
benefit analysis is very attractive. What would be involved would be 
the development of the ratio: aggregate of all the benefits/aggregate of 
all the risks. Ratios greater than 1.0 would allow research to proceed.
Ratios less than 1.0 would put a stop to it. In practice, however, the 
situation is a bit like that of Alice in Wonderland after she had eaten 
the cake that made her shrink: 

"The first thing I've got to do," said Alice to herself, 

as she wandered about in the wood, "is to grow to my right size

again; and the second thing is to find my way into that lovely 

garden. I think that will be the best plan." 

It sounded like an excellent plan, no doubt, and very neatly 

and simply arranged: the only difficulty was that she had no 

idea how to set about it (Carroll, 1946).

With risk/benefit analysis things are similar. The plan is excellent,
very neatly and simply arranged, but faced with the problem of carrying the
plan out it seems unlikely that anyone has any idea of how to set about
it. I turn now to some of the reasons for this state of affairs. 






Number of Variables 

One of the most obvious points to make is that even for a single piece 
of research the number of risks and benefits that have to be considered is 
enormous. In his extensive review of this topic Levine (1975) breaks 
risks and benefits down into categories that apply to individuals and 
categories that apply to groups. Then he proceeds to identify several more 
specific items in each grouping. They include physical, social, legal and 
economic risks and benefits. Since each of these can be further broken 
down into many specific types of risk and benefit the list quickly becomes 
so long that it seems unlikely that a manageable risk/benefit equation could 
be constructed from these terms. 
Subjectivity of Risks and Benefits 

There is also another point to make: the significance of the components
of the equation involves matters on which there must be great individual
differences and great differences among various social groups. In this connection,
consider what must be the most controversial psychological research from 
an ethical point of view, that of Milgram (e.g. 1965). As most readers 
of this essay will know, Milgram demonstrated that a good many Americans,
prodded by nothing more than firm direction to do so, will administer 
a dangerously strong electric shock to a fellow human being just for failing 
to produce the right answers in a faked study of paired-associate 
learning. 

The ethical question raised by this research involves the consequences 
for a subject of discovering this unpleasant truth about himself. For most 
people the effect would probably be ego-destructive. The extent of this
reaction would vary for different people and for members of different social
groups (Smart and Smart, 1965). The assessment of risks, for this reason,
would require (sometimes unattainable) information about the reactions of 
different individuals and groups to the same treatment. As we shall see 
later obtaining such information raises ethical questions of its own. For 
the moment it is important only to note that the subjective nature of risks 
and benefits adds to the problem of putting them into any realistic ratio 
form. 
The Problem of Aggregation 

Even if a catalogue of all the possible risks and benefits of 
psychological research existed, this information would only set the stage 
for further problems of great difficulty. It is possible to find 
references in the literature in this area which suggest that it is the 
"sum of" the risks and benefits that are to enter the risk/benefit ratio 
(Levine, 1975, page 45 ff.). Although it seems improbable that the various 
risks and benefits combine according to the rules of simple addition, it 
is unclear as to how they do combine or that the rules of combination 
would be the same for all conceivable ways of looking at a risk/benefit 
ratio. Such unclarity comes about largely because of the nonexistence 
of an appropriate metric for the quantification of risks and benefits. 
Quantification of Risks and Benefits 

Risks and benefits appear to have three properties that might enter 
into the process of assigning quantitative values to them. Both terms 
vary in a) probability of occurrence, b) the magnitude of the effect 
(positive value of the benefit and seriousness of what is risked) and 
c) the number of people affected. I shall concentrate on the first two 
of these quantities. As the discussion develops it will become clear
that these alone raise so many problems that it will not be profitable to
say much about the third. 
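Taken literally, the scheme under discussion would weight each risk and benefit by its probability and magnitude, aggregate the two sides, and apply the ratio rule stated earlier (greater than 1.0, proceed). A sketch with entirely hypothetical numbers shows what such a calculation would presuppose; the essay's argument is precisely that no such numbers can be supplied in practice:

```python
# Illustrative only: every probability and magnitude below is invented.
def expected_value(items):
    # Each item is a (probability of occurrence, magnitude of effect) pair.
    return sum(p * magnitude for p, magnitude in items)

benefits = [(0.5, 2.0), (0.1, 8.0)]   # hypothetical benefit entries
risks = [(0.6, 1.5), (0.2, 0.5)]      # hypothetical risk entries

ratio = expected_value(benefits) / expected_value(risks)
print(f"benefit/risk ratio = {ratio:.2f}")
print("proceed" if ratio > 1.0 else "stop")
```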

Risks. Confining ourselves to the major risk of subjects in the 
Milgram experiment in order to have a concrete example, we might note that 
each subject began participation with some probability of finding out an 
unpleasant truth about himself and this unpleasant truth would be in 
some measure destructive to the individual's self-esteem. But what values 
shall we place on these two aspects of risk? 

We know that the actual probability that a subject would deliver
the very strongest shock to the second individual in the experiment was
about .60. Perhaps this should be the probability value. This quantity 
was one of the results of the investigation, however, and could not have 
served in a calculation designed to decide whether or not the experiment 
should have been done. Perhaps this probability of .60 could have been 
estimated--say by college students who have considerable experience as
subjects or by psychiatrists who have professional knowledge of human reactions.
Other research tells us that this would not have produced realistic 
estimates, however. When the experiment was described to groups of college 
students and psychiatrists the students estimated that 3% of the subjects 
would administer the strongest shock; the psychiatrists' estimate was only
1%. Both were very far short of the actual probability. Obviously prior 
to the Milgram experiment this aspect of risk could not have been evaluated. 
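The size of the misjudgment cited here is worth making explicit. Using the figures in the text (an observed obedience rate of about .60 against estimates of 3% and 1%), the prior estimates were off by factors of roughly twenty and sixty:

```python
# Figures as cited in the text for the Milgram obedience situation.
actual = 0.60
estimates = {"college students": 0.03, "psychiatrists": 0.01}

for group, estimate in estimates.items():
    factor = actual / estimate
    print(f"{group} underestimated the obedience rate "
          f"by a factor of about {factor:.0f}")
```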

Without developing the argument in much detail the same conclusion 
seems to apply to the estimate of the seriousness of the self-revelation 
experienced by many subjects in this experiment. Without actually 
participating in the study the subjects could not know how they would react
or how they would react to their own reactions.

Benefits. If anything, the points just made about risks apply with 
greater force to benefits but for a rather different set of reasons. Although 
subjects may benefit slightly from their participation in research—for 
example, by learning a little about research and knowing that they have 
contributed to an important enterprise—the eventual benefits usually go 
to others than those who actually participate in an investigation. The 
risks, on the other hand, are here and now.

As a brief aside it may be worth noting that for some people this 
state of affairs raises the question of whether it is right for some to 
take risks now when the benefits go to others later. The answer to this 
question appears to be that we benefit now from the contributions of 
those who served earlier and that the bargain is not so unfair as the question 
may make it seem to be. 

Returning to the matter of quantifying benefits, there are several 
considerations that lead me to believe that it is unreasonable to 
expect any more success here than in the case of risks. 

1. Applications of much research cannot be anticipated. This point 
has been made many times but I might add one item from my own research 
history. In an old study (Kimble, 1955) of shock intensity and avoidance 
learning, I did a preliminary experiment in which I showed that rats 
responded in two quite different ways to relatively weak and relatively 
strong electric shocks. The purpose of this pilot work was just to 
establish ranges of intensities to use in the main experiment on the effect 
of various intensities. As it turned out, however, the preliminary 
experiment had important practical applications. The two reactions were
affected differently by drugs and this fact made important contributions
to the study of psychopharmacology. 

2. The effects of research are cumulative. Whereas risks tend to 
be localized in time and centered on particular individuals, benefits 
are diffuse and may have their effects more through a change in attitude 
and atmosphere than through a direct influence upon any single aspect of 
the world. 

I might cite, as an example of what I have in mind here, all of the work 
done since 1898 or so by the Thorndikeans, Skinnerians, and Hullians on the 
law of effect. Although I doubt that any single one of these thousands of 
studies ever was the basis for any significant educational innovation, the 
cumulative impact of the tradition was important. The current emphasis 
on reward for accomplishment rather than punishment for failure and the 
stress on student interest and motivation seems a direct consequence of 
the major ideas in the law of effect tradition. 

3. How does one identify a benefit? The previous example will 
already have suggested to some of my readers that the neat (if implicit) 
categorization of the consequences of research as beneficial or the opposite 
is far too simple. Surely there are those who believe that the catering
to student interest and the neglect of punishment in the schools mentioned
above are to blame for the deteriorating cognitive competence of our
population. Equally surely there are others who would point to exactly 
the same conditions as being responsible for the fact that American
scholars are the most productive in the world. If one argument is right
and the other wrong the question of benefit hinges on which side is correct. 
If both arguments are correct (which is possible if the effect of
reinforcement interacts with certain aspects of individual difference) the
risk/benefit waters become very muddied indeed. 

It is also important in this connection to make a related point. The 
knowledge generated by research is morally neutral. It does not know or
care whether it is used for human benefit or harm. It may in fact benefit 
some people and harm others simultaneously. To illustrate, suppose that 
research on persuasive communication tells a candidate for political office 
how to run his campaign so as to win. In that case the product of 
research has benefited the winning candidate and harmed the loser. 

A very similar conclusion arises from considerations relating to what 
happens to a single participant in an experiment. Referring to Milgram's
work once more, suppose that a subject in such an investigation finds out 
that he is capable of inflicting cruel and possibly fatal punishment upon 
another person under the slightest of provocation. Is this a benefit or 
not? It could be a benefit one might argue, because it is better to know 
such things about oneself in order better to control such tendencies. But 
with equal force it could be argued that such knowledge is the opposite 
of a benefit because of the damage it does to self esteem and the 
possible negative effects on one's life later. Obviously the assessment
of benefits is a much more complicated issue than first impressions may 
suggest. 
A Temporal Consideration 

Although I have certain objections to the model it is possible to 
draw an analogy between risk/benefit analysis and multiple approach- 
avoidance conflict situations. If one does draw the parallel another
complicating feature enters the picture. The values of the conflicting
components of a conflict change in time. It seems certain that the same 
thing must happen with risks and benefits. 

To illustrate: suppose someone signs up to participate in an experiment
next week. The chief negative aspect (risk) in the experiment is that it 
will involve electric shock, perhaps so strong that the participant will 
not be able to tolerate it. The chief attraction (benefit) is that the 
subject will receive $50.00 for his participation. Since the subject does
sign up, obviously the benefits outweigh the risks--but that is a week before
the experiment. 

During the week things change. If conflict theory provides a guide 
both the subject's fear of the painful shock and his desire for the $50.00 
increase as the moment of participating in the experiment approaches. The 
fear increases faster than the desire for money and possibly to a higher 
level. It may lead the subject to drop out of the experiment just before 
the appointed hour. 

What actually happens is not so important for our purposes as the fact 
that fear increases according to a steeper function than monetary desire. 
This means that a risk/benefit calculation involving these elements will 
have different values which depend upon the point in time where the 
calculation occurs. 
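The point about changing values can be sketched with two hypothetical linear gradients of the kind conflict theory suggests: both desire and fear grow as the experiment approaches, but fear grows on a steeper slope, so a ratio computed a week out and a ratio computed at the appointed hour point in opposite directions. All slopes and intercepts here are invented for illustration:

```python
# Hypothetical approach and avoidance gradients as functions of
# days remaining before the experiment (smaller = closer).
def desire(days_before):   # approach: attraction of the $50.00, shallow slope
    return 10.0 - 0.5 * days_before

def fear(days_before):     # avoidance: fear of the shock, steeper slope
    return 12.0 - 1.5 * days_before

for days in (7, 3, 0):
    ratio = desire(days) / fear(days)
    verdict = "participate" if ratio > 1.0 else "drop out"
    print(f"{days} days out: benefit/risk = {ratio:.2f} ({verdict})")
```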
Conclusion 

My own conclusion, having developed all of these points, is that the 
plan actually to calculate a risk/benefit ratio as an aid to making 
decisions about the value of a particular piece of research is unrealistic. 
There are too many variables to consider. No quantitative indices of either 
risks or benefits exist and none seems likely to be developed soon. The
operational bases for constructing such scales are difficult to specify.
About all that one can say is that the scales should probably be based upon 
the values of individual subjects. Finally even if the basic measures 
were available there is reason to suppose that ratios based upon them 
would change in complicated ways in time. To repeat, the formal use of 
risk/benefit ratios for the purposes of making ethical decisions about 
research seems difficult or impossible. 

The Ethics of the Ratio Itself 

Suppose we decide that the evaluation of research plans with the aid
of risk/benefit ratios is not impossible but only very difficult. What 
then? At least one "then" appears to be that we have raised some issues 
that are partly logical and partly ethical. 

Much of what I have covered in the foregoing pages of this essay 
makes the critical point. Largely the risks, but not most of the benefits, 
of research are borne by individual people. For this reason I suppose
that most of us would agree that the assessment of risks should be 
heavily weighted (if not totally determined) by those risks as they are 
perceived by the individual. 

To illustrate what this means and where the argument leads let us 
consider the Milgram type of experiment for one last time. Suppose we
are trying to recruit a particular subject for an experiment and want to make
a calculation of risks for him. For perfectly obvious reasons we cannot
inform him of the fact that he will probably be led to treat another
human being in a way that will make him feel guilty and ashamed of himself 
for a long time. Even if we could the subject would want to know more 
than the fact that the odds are 60/40 that he will obey and if he does
obey that he will be conscience-stricken as a result. He would (or should)
want to ask what the odds are specifically for him and how guilty he will
feel. 

Fortunately or otherwise, psychology does not have the ability now 
to answer such questions. But sometime it probably will and consider what 
that implies. It would mean that if we had the necessary information 
about this person's early upbringing, habits of cruelty, ways of experiencing 
guilt, relationships to authority and God knows what else we would be able 
to answer the questions raised specifically for him. But note that along 
the way we have invaded the privacy of the individual's personal life and 
that some of the information might, if disclosed, affect his reputation or
even put him in jail. In short the effective assessment of risks and 
benefits for any purpose seems certain to increase the risks. 

Such considerations add a new dimension to the issue under examination. 
Most of this paper has been devoted to making the point that risk/benefit 
analysis probably cannot be carried out for practical reasons. This brief 
section has made the further point that perhaps it should not be carried 
out for ethical reasons. 

Implications 

So where does all of this leave us? If the formal application of the 
risk/benefit calculus to the planning of research is practically impossible 
and morally objectionable, what alternatives are available? In my opinion
there are no alternatives. Some form of risk/benefit thinking is the 
only reasonable way of looking at the problem. In the concluding pages of 
this essay I shall argue for the application of a redefined and less 
formal risk/benefit equation as an aid to decision making in the conduct
of research with human beings.

A General Position 

The redefined risk/benefit ratio I wish to propose looks like this: 

Amount of Knowledge to Come from Research
-----------------------------------------
Risks as Seen by Reasonable People

I turn now to a discussion of the components of this proposed equation. 

Knowledge as the benefit of research. It is clear of course that it 
is not the process of research itself that is potentially beneficial 
to mankind. Rather it is the product of research—the advances in knowledge 
to which it leads. This obvious point raises several questions that 
deserve comment. 

Is it important to distinguish between the potential benefits of 
applied and basic research? In my opinion the answer to this question 
is "no," at least for the foreseeable future in the behavioral sciences.
The point I have in mind here is that our knowledge in the behavioral 
sciences is so limited that it will be important to carry out basic research, 
applied research and research that attempts to bridge the gap between the 
two. 

It is probably well understood that it would be a mistake in any science 
to restrict research strictly to applied problems. The trouble with such 
limited programs is that they are apt to produce results of limited 
usefulness. Typically the data obtained in applied research bear on some 
very specific problem and fail to generalize to other specific problems. 
A part of the aim of basic research is to obtain more general knowledge. 
Beyond that, as was mentioned earlier, the benefits of research are less 
predictable than one might hope and seem about as likely to come from 
basic research as any other kind. 




On the other hand, at least in psychology, the time has come to make 
the heretical point that basic research carried out in the absence of any 
concern for applicability has its own failings. The history of research 
on the psychology of learning from roughly 1929 (Hull's Functional Interpretation
of the Conditioned Reflex) to roughly 1952 (Hull's A Behavior System) will
serve to make the point. In my personal estimation research carried out 
in this period was probably more "scientific" than research that is being 
done now. The trouble with it, however, was that the areas of investigation 
(the non-threatening topics mentioned in the first paragraph of this essay) 
appear to exist only within the artificial confines of the laboratory. It 
was when the psychologist of learning turned to more realistic lines of 
investigation (free recall, lapses of memory exemplified by the "tip-of-the-
tongue" phenomenon, memory for the content of paragraphs) that more useful 
advances began to occur. This point now seems well on the way to receiving 
general acceptance in experimental psychology. The former disdain for 
application has now nearly been replaced by a concern for the "ecological 
validity" of experiments. 

Research quality and ethical behavior. The proposal that the numerator 
of the revised risk/benefit equation should be "amount of knowledge to come 
from research" has an interesting implication: If the equation provides 
an index of ethical research behavior (as I intend that it should) the 
conduct of bad research is unethical. This is because research that is
poorly conceived, improperly executed or inadequately analyzed will add 
nothing to knowledge and might even contribute a negative increment. 
Under such circumstances even the most trivial risks to subjects are
unwarranted.

I owe an expression of appreciation to Verna Shmavonian who started 
me thinking about how the quality of research enters the ethical picture. 
She is in no way responsible, however, for the curious twist this thinking 
finally took. 

Risks as seen by reasonable people. As in the case of benefits, we
can begin this discussion with an obvious point. It is unreasonable to 
ask for a complete accounting of risks prior to deciding to conduct a 
particular bit of research. The risks are too numerous and impossible 
to predict. Moreover (since the only sensible way to look at these risks
is in terms of what they mean for individual subjects) assessing the 
risks would sometimes pose greater ethical questions than the research itself. 
This last statement identifies my reason for leaving the assessment of risks 
up to the judgment of "reasonable people." 

But who are these reasonable people? I think there are three classes 
of them--the investigators themselves, subjects and institutional review 
groups. Beyond that it seems to me that in the great majority of the 
cases investigators will be the individuals in the best position to identify 
the risks. This is not just a self-serving evasion of the issue and I wish 
to push the point somewhat vigorously. 

The ethics of investigators. Why is it important for me to write 
this essay? I think because the times require it. The methods and motives 
of those who do research with human subjects are not very well understood 
by the general public. This is part of a generally anti-intellectual 
climate that places a low value on knowledge, scholarship and research. 
In such a climate it is not surprising to find that a cloud of suspicion
surrounds research with human beings. Either the investigator is seen as
an irresponsible player of trivial games or else he is cast in the role 
of the bad guy--a behavioral voyeur whose aims at best are to expose the 
most scandalous aspects of the human condition. Under such assumptions 
is it not reasonable to demand a complete ethical accounting of those 
involved in research? 

Although I have no intention to minimize the importance of the ethical 
issues I do think that it is essential to attempt to restore perspective 
and I have two general points to make. The first is that the devaluation 
of scholarship is a serious threat to our survival even as a species. More 
of that later. The second has to do with the ethical values of investigators. 

Put bluntly, I suspect that these values tend to be considerably higher 
than those of a good many people with whom subjects have daily contact— 
for example the used car salesman, the TV repairman and the precinct 
politician. Moreover the research scientist is sensitive to the ethical 
issues. Long before the current spate of codes of research ethics began 
to appear on the scene, similar codes had been developed in the behavioral 
sciences. By reason of a history of concern for the welfare of his 
subjects, the research investigator is, I think, in a better position than 
almost anyone to make the ethical decisions. 

Without going into great detail on any point the following list of 
ethical principles taken from the code of the American Psychological 
Association (Cook et al., 1973) illustrates the range of considerations 
which the ethical investigator takes into account in his assessment of 
risks. 

-- The investigator is personally responsible for the ethical conduct 
of his experiments. 

-- This responsibility extends to assistants and colleagues. 

-- The investigator must secure the subject's informed consent to 
participate. 

-- Deception, if used, should be undone at the end of the experiment. 

-- Participants may not be coerced into participation and must be 
free to drop out at any point. 

-- Participants must understand the procedures to be employed. 

-- Subjects must be protected from physical harm and mental stress. 

-- The responsibility to correct undesirable effects of the experiment 
remains with the investigator after the experiment is over. 

-- Misconceptions and misunderstandings arising in the experiment 
must be removed. 

-- Complete confidentiality is required of all information obtained 
about participants. 

This list provides the responsible investigator with a series of 
questions to ask himself about the treatment of participants in any 
experiment: "Do I have the subjects' informed consent?", "Is deception 
necessary?", and most importantly, "Have I and my colleagues and my 
assistants done everything we can to protect the welfare of our subjects 
as is required by the ethical code?" Only if such an analysis of risks 
to the participant yields satisfactory answers does the ethical 
investigator proceed. 

Subjects' assessment of risks. As was mentioned in an earlier 
section subjects in research typically are in no position to evaluate 
the costs of their participation until they have had the experience. Then 
it is too late by definition for this experience to contribute to a 
decision about the ethical aspects of the research. This state of affairs 
does suggest one important point to make. Most investigations require a 
certain amount of pilot work. The few individuals who participate at this 
stage of the research might well be asked about their reactions to the 
experimental procedures for purposes of uncovering risks that may have 
escaped the analysis described above. Procedures could then be modified 
in directions designed to minimize these newly recognized risks. 

Institutional Review Groups. At least on university campuses the 
existence of Institutional Review Groups is an important scientific fact 
of life these days. In the typical case these groups are in a position 
to assess risks in the cases where the investigator may not be a "reasonable 
person" because of his investment in his research or an insensitivity to 
the feelings of subjects. In most situations even slightly sensitive 
projects come to these groups. In my own experience they almost always 
detect any problems the investigator has overlooked. 

Summary and Comment 

I have proposed that the risk/benefit equation be rewritten in 
realistic terms: 

        Amount of Knowledge to Come from Research 
        ----------------------------------------- 
          Risks as seen by Reasonable People 

The major advantage of the rewritten equation is that it removes the 
necessity for making an impossible calculation. As we have seen many 
times now, risks and benefits in the usual meaning of those terms present 
insurmountable obstacles to quantification. The terms as redefined seem 
to be susceptible to statements involving judgments of at least more and 
less. Although this is not exactly a precise formula for ethical decision 
making, I think that it is a step in the right direction. 

I have left the main responsibility for making the risk/benefit 
calculation up to the investigator and have placed upon him two main 
obligations: 1) to be as sure as one possibly can that his research will 
lead to an advance in knowledge and 2) to assess the costs to participants 
and to minimize them. 

I have rejected the alternative of assessing risks for individual 
participants because of the practical impossibility of the task and because 
such an assessment would surely invade the potential participant's privacy 
and would potentially lead to other ethical risks. Although the regress 
entered into in that way might not be infinite, the interesting thought 
does occur that, once started on such a process of detailing risks, it 
might be difficult to recognize the proper stopping place. 

I have noted that the input of subjects might play a role in the 
investigator's assessment of risks. One could add to this point that 
taking such a view of the subject's participation might foster a sounder 
relationship than sometimes now exists between experimenter and participant. 

Finally I have noted that Institutional Review Groups protect the 
subject's welfare at another level. 

The Ultimate Risk/Benefit Equation 

As a way of bringing this essay to an end I would like to return to 
the point with which I began and to direct the reader's attention to 
what might be called "the ultimate risk/benefit equation." In this equation 
the benefits are those which research will contribute to the solutions 
of the big problems of society. The risks are those entailed by not 
doing research at all and trusting to common sense and accumulated wisdom 
to solve these problems. It seems to me that this alternative can be 
dealt with quickly. Common sense, intuition and accumulated wisdom have 
been with us forever. They seem to me to be as responsible as anything is 
for the sorry state the world is in now— where the disappearance of Man 
as a species is more than a fanciful abstract possibility. The time has 
come (if it has not passed) to turn to other sources of guidance and the 
only reasonable alternative is the knowledge provided by research. 


References 

Berkun, M. M., Bialek, H. M., Kern, R. P. and Yagi, K. Experimental 
    Studies of Psychological Stress in Man. Psychological Monographs, 
    1962, 76 (15, Whole No. 534). 
Carroll, Lewis. Alice's Adventures in Wonderland. New York: Random 
    House, 1946, pp. 44-45. 
Cook, S. W., Hicks, L. H., Kimble, G. A., McGuire, W. T., Schoggen, P. H. 
    and Smith, M. B. Ethical Principles in the Conduct of Research with 
    Human Participants. Washington, D.C.: American Psychological 
    Association, 1973. 
Hull, C. L. A Behavior System. New Haven: Yale University Press, 1952. 
Hull, C. L. A Functional Interpretation of the Conditioned Reflexes. 
    Psychological Review, 1929, 36, 498-511. 
Kimble, G. A. Shock Intensity and Avoidance Learning. Journal of 
    Comparative and Physiological Psychology, 1955, 48, 281-284. 
Kimble, G. A., Garmezy, N. and Zigler, E. Principles of General 
    Psychology, 4th Edition. New York: Ronald, 1974. 
Levine, R. J. The Role of Assessment of Risk-Benefit Criteria in the 
    Determination of the Appropriateness of Research Involving Human 
    Subjects. Preliminary draft. Unpublished manuscript, 1975. 
Milgram, S. Some Conditions of Obedience and Disobedience to Authority. 
    Human Relations, 1965, 18, 57-75. 
Smart, M. S. and Smart, R. Children: Development and Relationships. 
    New York: Macmillan, 1967. 




21 



A PHILOSOPHICAL PERSPECTIVE ON THE ASSESSMENT OF RISK-BENEFIT 
CRITERIA IN CONNECTION WITH RESEARCH INVOLVING HUMAN SUBJECTS 

by Maurice Natanson, Ph.D. 

"The doctor said that so-and-so 
indicated that there was so-and-so 
inside the patient, but if the 
investigation of so-and-so did not 
confirm this, then he must assume 
that and that. If he assumed that 
and that, then... and so on. To 
Ivan Ilych only one question was 
important: was his case serious or 
not? But the doctor ignored that 
inappropriate question. From his 
point of view it was not the one 
under consideration, the real 
question was to decide between a 
floating kidney, chronic catarrh, 
or appendicitis. It was not a 
question of Ivan Ilych's life or 
death, but one between a floating 
kidney and appendicitis. And that 
question the doctor solved bril- 
liantly, as it seemed to Ivan Ilych, 
in favour of the appendix, with the 
reservation that should an exami- 
nation of the urine give fresh 
indications the matter would be 
reconsidered." 

— Leo Tolstoy: The Death of 
Ivan Ilych 



I. On the Relationship between Philosophy and Science 

When philosophers discuss medical matters, there is a 
legitimate need to delimit their professional competence, for 
even when the issues involve ethical problems, it is by no 
means obvious that the philosopher is on solid ground in his 
inquiry. Just as the physician faces subtle and complex ethical 
difficulties in making some of his most important medical 
decisions, so the philosopher confronts recalcitrant, technical 
medical issues which frequently transcend his training and 
understanding. The philosopher must rely largely on a reading 
of the literature on the subject; direct clinical experience is 
denied him. And, of course, what used to be called "recent 
advances" in medicine now give way to new fields of specialisation. 
A few words (such as "genetic engineering") herald the ambiguities 
of a new age. The philosopher who yesterday may have been con- 
cerned about occasional pocAets of scientific ignorance, today 
is overwhelmed by entire wardrobes of illiteracy. Elsewhere I 
have briefly discussed some aspects of the ne~d for the training 
of individuals who have so.^e comprehension of both philosophy 
and medicine. That problem is not before us now, but its impli- 
cations cannot be wholly overlooked. The fact is that what the 
philosophers know about ethics and what the scientists know about 
medicine seldom come together in a way which is satisfactory 
for either side, let alone for the social good. But the problematic 
relationship between philosophy and medicine may be seen as part 
of a more general rubric: the interdependence of philosophy and 
knowledge. 

Rather than viewing philosophy and science as disparate 
disciplines which can be brought together only in artificial and 
cursory ways, it is possible to approach them as integral in 
their inner signification, as intimately related facets of the 
unitary reality of knowledge. Merleau-Ponty presents such a 
conception of unity: 

"The segregation we are fighting against is no less 
harmful to philosophy than to the development of 
scientific knowledge. How could any philosopher aware 
of the philosophical tradition seriously propose to 
forbid philosophy to have anything to do with science? 
For after all the philosopher always thinks about 
something; about the square traced in the sand, about 
the ass, the horse, and the mule, about the cubic 
foot of size, about cinnabar, the Roman State, and the 
hand burying itself in the iron filings. The philosopher 
thinks about his experience and his world. Except by 
decree, how could he be given the right to forget what 
science says about this same experience and world? 
Under the collective noun 'science' there is nothing 
other than a systematic handling and a methodical use — 
narrower and broader, more and less discerning — of this 
same experience which begins with our first perception. 
Science is a set of means of perceiving, imagining, and, 
in short, living which are oriented toward the same 
truth that our first experiences establish an urgent 
inner need for. Science may indeed purchase its exact- 
ness at the price of schematization. But the remedy in 
this case is to confront it with an integral experience, 
not to oppose it to philosophical knowledge come from 
who knows where. "2 

In these terms, the physician who is making a decision regarding 
the life of his patient, the experimentalist who is seeking con- 
sent from a subject for a procedure which entails serious risk 
to that individual, the lawyer or governmental agent or advisor 
who is charged with the task of formulating codes for ethical 
conduct on the part of researchers which will assure approp- 
riate protection of subjects for experimentation — all are 
tacitly involved in philosophical work. In addition to their 
connection with ethical matters, they are bound to try to 
appreciate the systemic unity of the domains of knowledge in 
which they operate. Philosophy is not something added to the 
recipe for knowledge; it is inevitably part of any effort to 
comprehend human experience. In this view, science and philosophy 
are both located within the unitary world which is experienced 
by all of us. 

The point with which I am concerned is that ethics and 
ethical considerations cannot be extirpated from the corpus of 
philosophy in order to become useful to the scientist. More 
strongly stated, if ethical systems or judgments are extracted 
for specific scientific purposes, they may perhaps serve as 
heuristic guides for inquiry, but their full force will be 
diluted, if not destroyed. In my judgment, ethics is rooted in 
the soil of philosophy but cannot be handled in the way in 
which nurserymen secure trees for replanting. Ethical problems 
are fundamentally tied to conceptions of Man, of the human 
reality . We face an ambivalent situation with regard to the 
ethical aspect of medical experimentation on human beings 
because the physician, the scientist, and even the lawyer are 
apt to turn to ethics in the narrower rather than broader 
sense, i.e., they are searching for specific recommendations of 
what is ethical in a context whose basic moral nature is defined 
by the study of Man. I do not think that the needs of the 
researcher and of the social order which seeks to protect the 
individual can be served by divorcing ethics from philosophical 
anthropology — the effort to respond to the question: What is Man? 
In fine, those analyses which are most likely to illuminate the 
underlying moral issues in experimentation on human beings are 
least likely to be the ones which offer concrete definitions, 
propositions, and calculi built out of such propositions in order 
to assist the formulator of ethico-legal codes. The paradox is 
that the more specific the ethical recommendation, the less 
chance there is for advancing the development of those primordial 
philosophical analyses which can tell us something significant 
and lasting about ourselves. 

The philosophical perspective from which I am writing is 
that of phenomenology and existentialism. More specifically, 
my fundamental approach to the problems which form the substance 
of this paper is indebted to the phenomenology of Edmund Husserl 
and Alfred Schutz and to the existential thought of Jean-Paul 
Sartre. I will avoid any attempt to summarize the essential 
doctrines of these thinkers, but a few words about their theoret- 
ical enterprise may prove useful to the reader. Husserl, Schutz, 
and Sartre disagree about important matters, but they are united 
in their concern with Man as the human reality , with Man as a 
being whose consciousness helps to build the microcosm in 
which he lives, and with Man as situated in the reality of daily 
life. Husserl speaks of the cardinal importance of the "Life-world," 
the stratum of mundane experience within which we locate our 
perceptual experience, our values, and our action. Schutz 
stresses the typified character of everyday existence, the 
projects of action through which ordinary human beings interpret 
their own and each other's meaning in the traffic of daily life. 
Sartre emphasizes the notion of situation itself. He writes: 

"For us, man is defined first of all as a being 'in a 
situation. • That means that he forms a synthetic 
whole with his situation —biological, economic, 
political, cultural, etc. He cannot be distinguished 
from his situation, for it forms him and decides his 
possibilities; but, inversely, it is he who gives it 
meaning by making his choices within it and by it. 
To be in a situation, as we see it, is to choose oneself 
in a situation, and men differ from one another in 
their situations and also in the choices they themselves 
make of themselves. What men have in common is not a 
'nature' but a condition, that is, an ensemble of limits 
and restrictions: the inevitability of death, the 
necessity of working for a living, of living in a world 
already inhabited by other men. Fundamentally this con- 
dition is nothing more than the basic human situation, 
or if you prefer, the ensemble of abstract character- 
istics common to all situations. "3 

In terms of the Life-world, man in daily life understands 
or misunderstands his situation in concrete ways, has a lucid 
or opaque sense of his own interests, and carries with him the 
resources of a sometimes acute and sometimes baffled intelligence. 
Yet it is within the range of those talents and debilities that 
he is compelled to construct and interpret the meaning of his 
experience. Scientific models of explanation of human conduct 
are abstractions of a very restricted and specialized sort which, 
so phenomenologists believe, must attend closely to and be 
responsive to the naive models of interpretation and action which 
common-sense human beings build out of their insight into and 
bewilderment with the materials of their own existence. In the 
realm of problems of risk and benefit in research and experimen- 
tation on human subjects, the resources and needs of the Life-world 
must not only be respected but must be studied in the most 
searching fashion, for what happens to all of us, ordinary men 
and women and physicians and researchers alike, remains rooted 
in mundane life and, ultimately, must be interpreted and evaluated 
by the categories of mundane rather than scientific experience. 

II. The Concepts of "Risk" and "Benefit" 

The notion of the Life-world provides a point of access to 
the understanding of risk and benefit because it makes it poss- 
ible to distinguish between risk and benefit as quantifiable 
terms and risk and benefit as primary and endemic features of 
everyday experience. Of course, risk and benefit have an enor- 
mous range of reference. At one end of the spectrum, risk is a 
commonplace feature of the most taken for granted acts. As one 
writer points out, "...the baby could suffer fatal injury if 
dropped while being weighed."4 We shall be concerned with more 
substantial risk than that. At the same time, however, it must 
be recognized that whereas the formulation of risk (and benefit 
as well) is the professional responsibility of the investigator, 
whatever the formulation turns out to be must be interpreted by 
the subject or patient. What I am concerned with here is not 
simply the question of translating the language of the scientist 
into that of the layman. Presupposed in all such translation is 
the conceptual stance of the ordinary individual, the categories 
through which he comprehends the elements of his experience and 
their implications for his well being. Were the essential problem 
of "informed consent" just a matter of the effective restatement 
of technical language into straightforward, everyday language, 
the difficulties arising out of securing informed consent would 
disappear rather quickly. The difficulties are persistent because 
they are functions of something other than the mere efficacy of 
translation. In the instance of risk and benefit, the translation 
of those terms and their implications into the Life-world of the 
patient or subject entails a primordial interpretation on the 
part of the individual who is doing the risking or expects to be 
benefited or have others benefited. 

There are axioms of mother wit: there is no absolute 
assurance that what is reasonably expected will necessarily follow 
from an experiment; traditional and medically conservative 
measures may nevertheless produce undesirable effects in a 
particular case; benefit sought from a given procedure may carry 
along with it undesirable side effects; benefit to others may 
prove to be illusory or even detrimental; the relationship between 
what may be good for the individual and what may be good for 
society is generally uncertain, unstable, and revocable. Whether 
or not the individual formulates such axioms in the way I have, 
their import is naively grasped by everyone who wishes to avoid 
trouble, to preserve good health, and to survive under optimal 
circumstances. The axioms of mother wit are implicit assumptions 
which are part of the fabric of common sense. To be sure, there 
are some who are ignorant not only of elementary features of 
human anatomy and physiology but who are too timid to ask their 
doctors for more information. Not long ago, I read in a newspaper 
medical column a letter by a young man whose physician had told 
him that he had a spleen. How serious was that? the author of 
the letter wanted to know. There are also those who do not want 
to be told what the case is, what the possible dangers are, what 
the full implications of an experimental procedure might be. It 
is not possible here to proceed casuistically. Instead, I propose 
to turn directly to the concepts of risk and benefit without 
losing sight of the notions of the Life-world and of situation. 

A. Risk: 

It is necessary to distinguish between risk for the indi- 
vidual undergoing treatment by his own physician or surgeon and 
risk for the individual who is being asked to participate 
voluntarily in an experiment in which he is to be a subject. The 
more pressing problems for our consideration appear to fall in 
the second classification, but the complex connection in therapy 
and experimentation between the two categories must be considered.5 
To begin with, it is not unusual for writers on this subject to 
point to the problematic nature of the treatment-experiment 
relationship. As Maurice B. Visscher says, "...it is difficult 
to draw the line between what is experiment and what might be 
called medical treatment."6 Or as Herrman L. Blumgart puts it, 
"Every time a physician administers a drug to a patient, he is in 
a sense performing an experiment."7 But the same circumstance does 

not pertain in the distinction between, on the one hand, experi- 
menting on one's patient for purposes directly related to trying 
to cure or alleviate his specific medical problems at a time 
when such therapeutic efforts are deemed necessary by the 
physician and, on the other hand, asking an individual to partici- 
pate in an experiment from which he will not personally benefit 
in medical terms. Otto S. Guttentag recommends that "...a climate 
of spiritual values should be fostered in which experiments done 
not for the immediate good of the experimental subject but for 
the welfare of mankind would be performed only by experimenters 
who are not simultaneously responsible for the clinical care of 
these experimental subjects."8 By separating physician in charge 
of the care of his patient from experimenter in control of his 
subject, it is hoped that the conflict of therapeutic-experimental 
interest may be avoided, though something of a paradox is 
generated in the process: the person best able to care for his 
patient is the physician; the person charged with the welfare of 
his subject — the experimenter — is not primarily oriented toward 
caring for his subject.9 

The paradox we have pointed to goes beyond the question of 
whether the clinician and the experimenter should have different 
roles with respect to patient and subject. The broader issue is 
the relationship between care, which is committed to the welfare 
of a concrete human being who is ill by a fellow human being, the 
physician, and treatment, which may indeed be all that is offered 
by some physicians whose interest in their patients is rather 
limited but which, in an experimental context, is tied to a 
different goal: the appropriate completion of the experiment. 
The risk to the experimental subject is far greater than the risk 
to the patient. Obviously, the degree of risk may, empirically, 
be reversed in the two situations under certain circumstances. The 
patient may be risking his life in a therapeutic procedure 
involving dangerous surgery, whereas the experimental subject 
may be submitting to routine and completely safe testing having 
to do with moderate changes in diet for a normal individual. 


Indeed, the experimental subject may be part of a control group 
to whom nothing is done. But the paradox remains: when the patient 
becomes the subject, he needs more rather than less care, yet the 
risk of receiving that care from the experimenter is substantial. 
For the experimental subject who is not a patient, the risk is 
even greater. What, exactly, is risked? Most simply, that the 
well being of the subject is not the dominant concern of the 
experimenter, who may be more interested in the intellectual- 
scientific challenge of the experimental work itself, who may be 
strongly motivated by the expectation of publishing his results 
in the hope of advancing his professional career, or who may be 
unduly influenced by his colleagues in an experimental team. Such 
desires and pressures are not in themselves wicked and unethical; 
they are implicit hazards, however, for the subject who may assume 
that the experimenter places the well-being of his subject above 
personal gain. 

B. Benefit: 

In the case of the patient, benefit is directly correlated 
with the treatment of his illness. That is hardly to say that 
benefit is assured; it is only to say that what is being risked 
is being risked for the possibility of individual betterment. 
When it comes to the subject, however, benefit is correlated with 
a larger domain: those afflicted with a certain disease, those who 
would benefit if an effective and safe vaccine were developed for 
the inoculation of those who might develop a certain disease, 
those who might benefit indirectly from knowledge gained in 
research on one medical problem which has or may prove to have 
relevance for another problem. Ultimately, society itself is 
said to benefit from the advance of medical knowledge. We shall 
say something about society and the individual shortly, but for 
the moment, it might be suggested that the concept of benefit is 
vague and fugitive to the subject in many cases and may serve as 
a shield not only to the experimenter whose medical ethics are 
questionable but also to the ethical experimenter who may be 
unwilling to face the full implications of a procedure which 
legitimates risking harm to one group of individuals for the sake 
of another group of individuals. Yet it would be unacceptable to 
reduce the meaning of benefit to patient-benefit alone. The 
decisive consideration is that benefit and risk be viewed in 
integral fashion. That means that what benefits human beings 
usually carries with it risk, and that risk which is deemed 
"minimal" or "acceptable" nevertheless may mean severe suffering 
or death to some. Chauncey D. Leake writes: 

"There is no absolute safe and effective chemical agent 
that may be used for biological effects in humans, not 
even common table salt. The Gaussian distribution curve 
inevitably fits any drug, if it is used on enough people: 
in a few there may be no effect at all from the same 
quantitative dose that may produce serious injury or 
death in some. Here, social welfare must be considered, 
as when the Canadian authorities went ahead with mass 
protection against polio, using oral vaccine, although 
four people out of some 2,000,000 met death ascribed 
to it. Even a hedonistic ethic would take the chance of 
1 in 500,000."10 

A hedonistic ethic might very well accept the risk of 
1 in 500,000, but we are left with the question of whether to 
accept a hedonistic ethic. In the case of the polio vaccine, 
it is essential to recognize the nature and scope of the suffering 
and incapacitation of polio victims, the widespread awareness of 
the character of the disease, and the likelihood of permanently 
eliminating the devastating effect of polio on thousands of 
people. In assessing benefit, in this instance, there is a clear 
recognition of the quality and quantity of suffering and sufferers 
in the past. Benefit is directly related to the history of concrete 
and widespread anguish of victims and of those that love them. 
In the absence of such a history, it is prudent to reflect more 
thoroughly on the problem of justifying the death of some, 
however few, for the sake of the health of the many, no matter 
how many. The reasons which are accepted for justifying the 
risking of the life of the few are of critical importance in 
justifying the integrity and morality of the social order. In each 
case, those reasons must be intimately associated with the reality 
of suffering and the reality of sufferers. Furthermore, those 
reasons must be explained to both risk-takers and to those who desire 
them to take risks, to ordinary people and to physicians, experi- 
menters, and, perhaps most important of all, to medical students 
and graduate students going into medical research. Not only 
giving reasons but defending those reasons in the context of the 
social order is essential to the protection and honor of all 
those who are involved in any way in experimentation on human beings. 

If a casuistic analysis of risk and benefit problems lies 
beyond the scope of this paper, it would seem that all that can 
be recommended consists in generalizations which falter before 
the determination of concrete cases. Earlier, I pointed to the 
desirability of a phenomenological-existential approach to the 
problems before us. What help can such an approach provide if 
specific determinations in concrete cases can only be loosely 
guided by general recommendations? In fact, the central diffi- 
culty in trying to find a way in the thickets of risk-benefit 
problems is that where detailed and highly specified protocols 
are issued, the physician and the experimenter who are highly 
ethical individuals may well be compelled, by constraint of law, 
to circumscribe their care and treatment of the patient and 
subject to the medical disadvantage of both risk-taker and the 
social good, whereas a more open and flexible set of guidelines 
may be misused in such a way as to injure or endanger the well 
being of the patient or subject and, in turn, threaten the moral 
fabric of society. Faced with a somewhat analogous paradox, the 
law tends to favor the more generalizing alternative. According 
to Paul A. Freund: 

"As part of its conservatism, the law tends to generalize 
on the basis of a balance of risks. If, for example, it 
is thought that there is a predominant risk of perjury 
in claims that oral contracts have been made, the law 
enacts a statute of frauds requiring as a general rule, 
as an invariable rule, that there be a writing for 
important contracts, even though in some cases there 
is created a counterrisk that thereby soi.e genuine oral 
agreements will not be recognized. If there is a pre- 
dominant risk of suppressing information and criticism 
by enjoining the publication of allegedly libelous 
matter the law will make a general rule of refusal to 
enjoin, even though there is a countervailing risk
that some actually defamatory matter will thereby be 
allowed to circulate. The law takes refuge in general
rules as metaphysics resorts to absolutes."11

In the case of medical practice, viewed from an ethical perspective, 

it is not evident that the analogy holds true. The law might 

require a written consent form in cases of experimentation, but 






it is not clear what would constitute an "important" case, nor 
is it obvious what would be accepted as a "predominant" risk. 
Paradigms for "important" and "predominant" can usually be 
provided; but individual instances are uncertain and boundary 
cases are ambiguous. In any event, what is being risked and what 
is hoped for as benefit may be uncertain in the minds of both 
subject and experimenter. The paradox of concreteness and 
generalization continues to bedevil our discussion. But paradox 
need not lead to demoralization or to ethical paralysis; rather, 
it is the inescapable medium through which the tension between 
concreteness and generalization finds its expression. 



III. The Needs of Society and the Rights of the Individual 

The contrast between society and the individual may be 
understood as the contrast between the individual and other 
individuals. The common good cannot be divorced from the good 
of individuals. But the good of individuals presupposes a 
recognition of values which transcend the individual — let us 
call them moral values — at the same time that they define the
character of society. Society may embody and exemplify moral
values, but it does not provide a ground for the legitimation
of morality. Society is "moral" to the extent that it commits
itself to the good of the individual, a good which transcends 
the individual for the sake of the individual. A double trans- 
cendence reveals itself here: the individual is transcended 






insofar as moral values go beyond any one person's interests

and needs, and society is transcended to the extent that the 

moral values it represents are not themselves justified on the 

sole grounds of the common good. During an epidemic, physicians 

and government officials have the right to segregate individuals 

who are likely to contaminate others, but that right (and 

obligation) does not carry with it an authorization to destroy 

those who endanger the lives of others. Certain rights of the 

contagious minority must be respected by the endangered majority. 

The moral value at issue here is that human beings are, by nature 

of their humanity, committed to the care of the afflicted. Should 

there be a situation in which the only way to protect the rights 

of the unafflicted is by destroying the afflicted, the social 

order would be challenged in its own moral inwardness. Nor is the

moral tension eased if the minority involved is a tiny one. Hans 

Jonas writes: 

"Society, in a subtler sense, cannot 'afford' a single 
miscarriage of justice, a single inequity in the 
dispensation of its laws, the violation of the rights 
of even the tiniest minority, because these undermine 
the moral basis on which society's existence rests. 
Nor can it, for a similar reason, afford the absence 
or atrophy in its midst of compassion and of the effort
to alleviate suffering — be it widespread or rare — one
form of which is the effort to conquer disease of any
kind, whether 'socially' significant (by reason of
number) or not. And in short, society cannot afford the
absence among its members of virtue with its readiness 
to sacrifice beyond defined duty." 12 

The rights and obligations of society toward its members 

and future members (and past members as well) are limited by 

its implicit as well as explicit commitment to the good of the 

concrete individual who seeks his physician's care. The physician 






honors the good of society insofar as he respects the good of
his patient. Apart from situations of pestilence, widespread 
starvation, natural disasters, or catastrophes of war where, 
for the time of the emergency, traditional commitments may be 
qualified or suspended, the needs of the patient have primacy. 
No equivalent primacy exists, in ethical terms, from the stand- 
point of society. It is misleading to emphasize the good of 
future members of society at the expense of present members.
As Jonas puts it, "our descendents have a right to be left an 
unplundered planet; they do not have a right to new miracle 
cures. We have sinned against them if by our doing we have 
destroyed their inheritance...; we have not sinned against them 
if by the time they come around arthritis has not yet been 
conquered (unless by sheer neglect)."13 But it is evident that
all physicians do not subscribe to this view. It is further 
evident that physicians who are fundamentally involved in 
research may interpret the society-individual relationship in 
a different way than physicians who are primarily concerned 
with caring for their patients. When the two overlapping 
categories coincide, some interesting problems arise. Renee C. 
Fox has presented a thorough description of the difficulties
experienced by one team of research-physicians in determining
the limits of ethical medical conduct in treating patient-subjects.
She writes:

"The Metabolic Group was also engaged in a considerable 
amount of research which they undertook primarily to 
advance general medical knowledge, and only secondarily 
or incidentally because they thought it might be helpful 
to patients who consented to act as their subjects.






The members of the Group 'hoped' that the patients who 
participated in these experiments might gain some 
clinical benefit from doing so, and they were pleased 
when this happened. But to the limited extent that 
medical ethics allowed them to do so, they subordinated 
their clinical desire to serve the immediate interests 
of the particular patients involved in such experiments, 
and gave priority to the more long-range, impersonal 
research task of acquiring information that might be of 
general value to medical science. "14 

It is not easy to reconcile medical intervention done with a 
bare minimum of ethicality with serving the good of society. 
It would seem that such intervention has only a limited connec- 
tion with the welfare of the patient-subject but a powerful 
relationship to the abstract development of medical knowledge. 

I do not think that an ethical balance can be struck between 
the needs and rights of the individual and the needs and rights 
of society if what is relinquished in the former is the trust 
that tacitly undergirds the relationship between patient and 
physician or if what is compromised in the latter is the morality 
which is based on the inviolability of human freedom. In fact, 
the very notion of "balance" in this context is unacceptable if 
it leads to a "give and take," a "more or less" of qualitative 
human assurances which are irreducible and, in principle, incapable 
of being negotiated in terms of a quantitative calculus. One such 
human assurance is the patient's right to expect that anything 
done for him is being done in his interest, as that interest is 
interpreted by the physician who cares for him. In the case of 
the subject-experimenter relationship, the fundamental human 
assurance is that the most honest, non-self serving effort has been 
made in a well-designed experiment to inform the subject clearly 
and with appropriate fullness about what will go on in the 






experiment, about what known dangers there may be, about the 

possible injurious side effects that are deemed plausible, or 

about the vaguer risks which are being taken by the subject, 

given the status of what is not known about the possible results 

of the procedure at issue. The words "appropriate fullness" may 

seem to beg the question. The acceptability of the phrase depends 

ultimately on the honesty of the person who seeks "informed 

consent" from the subject. "Honesty" hardly implies omniscience; 

it does imply that the subject's good is not given secondary 

consideration merely because he has volunteered for the job. In 

the case of patient-subjects, appropriate fullness demands of 

the physician-experimenter that serious risk be taken only when 

the patient-subject's welfare is of primary concern. As Henry K. 

Beecher states: 

"Considerable or even great risk is not necessarily an 
absolute injunction against acceptance by the investigator 
or the subject. Indeed, some procedures have been assoc- 
iated with a fatal outcome and yet may still provide 
advantages great enough to outweigh the hazard involved. 
One cannot forbid what may be a perilous procedure on the 
basis of unknown risk alone. It seems to me, however, 
that great risk should usually be accepted only if the 
subject promises to profit directly from it. "15 

We are still left with the category of fully informed, 

consenting subjects (including some patient-subjects) who volunteer 

for potentially hazardous experimentation from which it is unlikely 

that they can derive any personal medical benefit. Granted the 

problematic status of the notion of "informed consent,"16 it is
still possible to say that among the rights of individuals is the 
right to serve as a volunteer in an experiment which may benefit
others. However, society is obliged to guard against abuse by 






experimenters of the rights and needs of those who are most 
vulnerable to unethical conduct by those doing research: the 
sick, the old, the retarded or mentally ill, children, prisoners, 
the impoverished, and those whom life has neglected or betrayed. 
Perhaps it is not really possible to arrive at an absolute state- 
ment of the sufficient conditions for fully informed consent, but 

it is possible to state more comprehensively the necessary con-

ditions which must be met.17 Medical codes, guidelines, and

protocols already exist which serve to protect both subjects and 

experimenters, but the inevitable paradox of the concrete and the 

abstract arises once it is asked how a general recommendation or 

requirement can be applied in a specific case. Henry K. Beecher

warns: 

"There is the disturbing and widespread myth that 'codes' 
(all of which emphasize, above all else, consent) will 
provide some kind of security. While there is value,
doubtless, to be gained from their examination as guides
to the thinking of others on the subject, the reality is
that any rigid adherence to codes can provide a dangerous
trap: no two situations are alike; it is impossible to
spell out all contingencies in codes. When an accident
occurs, in the course of experimentation, it will be easy 
for the prosecution to show failure to comply fully, and 
an endless vista of legal actions opens up. It is a 
curious thing that lawyers for even the greatest insti- 
tutions are much more likely, in my experience, to cripple 
themselves and their institutions with inevitably imperfect 
codes than are the investigators involved, who usually 
understand the pitfalls represented by the codes. Security 
rests with the responsible investigator who will refer 
difficult decisions to his peers. "18 

Nevertheless, such documents as the Nuremberg Code and the

Declaration of Helsinki do more than provide a guide "to the 

thinking of others on the subject"; they embody and represent 

commitments to moral value which make it possible for both

investigators and subjects to recognize and affirm in these






formulations the conditions of treatment of and concern for 
fellow human beings. The ideality of the codes does not 
detract from their primary purpose. 

Unethical, irresponsible, or incompetent investigators 
(or those who choose to exceed their domain of competence) 
cannot be legislated out of existence, but they can be con- 
strained by the criticism of those who are ethical, responsible, 
and expert. It has been pointed out that editors of medical 
journals have a particular responsibility to exert caution in 

evaluating articles submitted for publication which are based 

on data which were unethically obtained.19 Such caution should

make it more difficult for those whose announced plans for 

experimentation were found acceptable by their colleagues and 

superiors but whose actual practice exceeded ethical standards. 

Knowing in advance that important results unethically obtained

will not be published will tend to restrain the unethical 

investigator. More difficult to cope with is the situation of 

the experimenter who does commit himself to the constraints of 

ethical practice but who finds it extraordinarily difficult at 

times to treat subjects as moral ends without denying them and 

others the utilization of perilous means. The physician-investigator 

in particular is haunted by the desire to stand by his patient 

while also honoring his commitments to the advancement of medical 

knowledge. As I have already suggested, there is no calculus 

which can substitute for determining the qualitative good of 

human beings.20 To say that, however, is not to claim that such

a calculus cannot be constructed; it is only to warn that any 






calculus must ultimately be interpreted by human beings and that 
the act of interpretation presupposes qualitative factors which 
have been incorporated in the initial construction of a mathematical
model as well as the qualitative character of the act of interpre- 
tation itself. No doubt, a computer could be programmed in such a 
way as to pick out the best candidates for an experiment; the 
choice of those candidates, however, needs to be made by a human 
being who is morally obliged to reflect on the meaning of "best 
candidates" not only for the benefit of the experimenter but also 
for the welfare of the candidate. A computerized blood bank is an 
extraordinarily useful instrument, but it tells us nothing about 
what is morally demanded of those in charge of it. 



IV. On Dignity and Philosophical Method 

Discussions of experimentation on human beings and codes 
which seek to protect the individual against unethical conduct 
on the part of physicians and experimenters frequently stress 
the importance of honoring the dignity of the person. So, for 
example, the Principles of Medical Ethics of the American Medical
Association includes the following dictum: "The principal objective 

of the medical profession is to render service to humanity with 

full respect for the dignity of man."21 Or as Herrman L. Blumgart

expresses it, "A person has a right not only to live in dignity, 

but also to die in dignity."22 Between the affirmation of such

norms and the reality of medical practice (which, in turn,






reflects the reality of societal demands and commitments) lies 
the dark terrain of actual practice: the realm of second and 
third-rate medical treatment performed by mediocre and sometimes 
incompetent staff in hospitals and offices which are often 
teeming with people whose "dignity" is of little consequence to 
those who are supposed to "care" for them. The more immediate 
concerns are being able to handle the flow of emergency cases, 
ascertain the patient's ability to pay for services to be pro- 
vided, and sustain basic services under distressing circumstances. 
Being able to attend to the patient's dignity in such conditions 
may be viewed as a luxury. In any event, the medical microcosm 
mirrors the macrocosm of society, where dignity is an ideal 
which is often subverted by bad faith. Guido Calabresi points out:

"Accident law indicates that our commitment to human life 
is not, in fact, so great as we say it is; that our
commitment to life-destroying material progress and 
comfort is greater. But this fact merely accentuates 
our need to make a bow in the direction of our commitment 
to the sanctity of human life (whenever we can do so 
at a reasonable cost). It also accentuates our need to
reject any societal decisions that too blatantly contradict 
this commitment. Like 'free will, ' it may be less important 
that this commitment be total than that we believe it 
to be there. 

Perhaps it is for these reasons that we save the man
trapped in the coal mine. After all, the event is dramatic;
the cost, though great, is unusual; and the effect in 
reaffirming our belief in the sanctity of human lives is
enormous. The effect of such an act in maintaining the 
many societal values that depend on the dignity of the 
individual is worth the cost. Abolishing grade crossings
might save more lives and at a substantially smaller cost
per life saved, but the total cost to society would be far
greater and the dramatic effect far less. I fear that if
men got caught in coal mines with the perverse frequency 
with which cars run into trains at grade crossings, we 
would be loath to rescue them; it would, in the aggregate, 
cost too much. "23 

Surely, affirming the dignity of the patient is axiomatic for his 






doctor, but unless the affirmation carries existential force 
along with it, its axiomatic status means that it is simply 
taken for granted and that its ideal or normative character 
remains distant from specific application. No one wishes to be
on record as opposing the dignity of man, but approving that
sentiment hardly calls for much unless it requires moral practice —
in which case, its demands are profound.

If there is bad faith in society, it does not follow that 
there must be bad faith in individual choice. If society "chooses" 
to do something about the individual and relatively uncommon but 
dramatic case of the trapped miner rather than the more widespread 
tragedy of collisions at grade crossings, it is not because the 
dignity of one victim is more compelling than the dignity of 
another victim. As Calabresi indicates: "The notion is incorrect
that we in some sense choose the number of people who will be 
killed in automobile accidents by choosing a market system that 
will determine how much safety is worth. The notion is only made 
plausible by a verbal trick — by using the words 'we choose' to 
describe both the effects of the social system in which we live
and which we tolerate, but which we cannot in fact be said to 
choose, and events as to which we can be said to exercise 
purposive choice." But in the case of experimentation, choice 
does lie with experimenter and subject. Experimentation is 
purposive choice. Accordingly, the experimenter, unlike society 
at large, is obliged to respect the dignity of the concrete
human beings who come within his professional purview. Just 
what does dignity signify in this context? We have returned not 






only to a philosophical issue but, in a way, to the philosophical 
approach which we outlined so hastily at the outset of this
inquiry and to the status of the central terms of discourse
which have arisen in the course of our discussion. It is time 
to attend further to those problems of philosophical method which 
underlie our comprehension of the nature of man.

When we say that it is the professional responsibility of
the physician to care for his patient or when we say that the
dignity of each patient must be respected, we are making trans-
empirical recommendations. The care provided by a physician to
a patient may, in a narrow sense, be reviewed by others; but that
only means that services are being scrutinized. Care, as we have
been using the word, refers to the commitment the physician has
made as a fellow human being to another fellow human being who is
in need. Care in this sense is recognized by those who are immed-
iately involved in the situation of care: physician, patient,
and others who are truly concerned with the well-being of the 
patient. In a similar way, respect for and recognition of human 
dignity is a function of the individual relationship between 
physician and patient. Both care and dignity do not preclude
therapeutic distance on the part of the physician; indeed, such 
distance is necessary if he is to function effectively. But 
distance does not either damage or replace devotion and 
dedication. If care and dignity are transempirical in nature, it 
does not follow that they are incomprehensible either to the 
patient, the physician, the subject, or the experimenter. To the 
contrary, care and dignity are terms whose meaning is rooted in 






the Life-world and whose appreciation, therefore, is available 

to ordinary men and women and children. To be treated with 

respect and decency is the common desire of all of us. To ignore

the dignity of the person or to treat him without really caring 

for him results in human resentment. That such commonplaces are 

recognized and affirmed by common-sense people is precisely the

point of self-interpretation within the Life-world. We recognize

as mundane creatures that although we may be replaceable as 

organisms, our identities as persons are not commodities. To

care for and respect the person has little to do morally with

liking the individual, whatever the psychological relationship 

may be between physician and patient. Rather, care and respect 

are directed toward the privileged being of the person. James

Agee writes: 

"Each is intimately connected with the bottom and the 
extremest reach of time: 

Each is composed of substances identical with the substance
of all that surrounds him, both the common objects of his 
disregard, and the hot centers of stars: 

All that each person is, and experiences, and shall never 
experience, in body and in mind, all these things are 
differing expressions of himself and of one root, and are 
identical: and not one of these things nor one of these 
persons is ever quite to be duplicated, nor replaced, nor 
has it ever quite had precedent: but each is a new and 
incommunicably tender life, wounded in every breath, and 
almost as hardly killed as easily wounded: sustaining, for 
a while, without defense, the enormous assaults of the 
universe."25

The same integrity between care and dignity must be retained 

or at least struggled for in the relationship between experimenter 

and subject. It is possible that unethical means may yield

potentially beneficial results; it is certain, however, that the 






deliberate choice of unethical means will damage the conditions 
of trust between human beings which constitute the realm of 
moral ends. When I said earlier that the relationship between
risk and benefit must be viewed in integral fashion, what I
meant was that the concrete situation of the individual within
the social order (including its historical dimension) commands 
fundamental respect. Understanding that situation means holding 
in tension the way in which the individual interprets the meaning 
of his own action and the manner in which society comes to self- 
recognition through the moral choices made by its agents. When 
the subject-volunteer is genuinely and thoroughly informed, when 
he knows that the considerable risk he agrees to take cannot 
benefit him personally as far as his health is concerned, and 
even when he considers himself a co-worker with the experimenter 
in the cause of general scientific knowledge, still there 
remains a moral (though not an ethical or legal) constraint on 
the investigator to do his best by a fellow human being, to 
minimize or to try to control whatever pain the subject may 
receive, and to do everything reasonably and appropriately 
possible to guard against damaging or fatal consequences. Perhaps 
the most difficult task the experimenter faces is to refuse to 
capitalize on the good will and trust of his subject for the sake 
of the experiment. I remain haunted by a fragment from a 
physician-experimenter's case history: "This amiable and cooperative 
gentleman, having previously been prostatectomized, orchidectomized 
and adrenalectomized, reenters to be nephrectomized." ' It is 
remarkable how this gentleman's amiability has managed to keep pace 






with his cooperativeness, for his prostate, testicles, and adrenal 
glands have been removed, and he now faces the further surgical
loss of a kidney. What a sadly punishing history remains locked
in that medical sentence. 

My conclusion can be presented in straightforward terms. An
appreciation of the structure and texture of the Life-world, of 
the meaning of human action in mundane experience, and of the 
fundamental situatedness of persons within the world is essential 
to the determination of risk and benefit relationships in all 
experimentation on human beings. A phenomenological and
existential approach to these problems offers a valuable point of 
access to the interpretation of the nature of medical care and 
human dignity. Any assessment of risk-benefit criteria must remain 
grounded in the moral imperatives of human beings seeking to 
fulfill themselves in their dependence upon their fellow human
beings. The abstractness and generality of moral claims cannot be 
reduced to quantitative models for medical decision without eroding 
the very goals of a just social order in whose name experimentation 
is carried on. Care and dignity are not euphemisms for unrealistic 
demands; they are the substance of our moral energies and the 
means through which we express the paradox-ridden career of man 
in the social world. 

Maurice Natanson 
University of California, Santa Cruz 






NOTES 



1. Maurice Natanson, Phenomenology, Role, and Reason (Ch. XIV,
"Benefit and Experimentation"), Springfield, Ill.: Charles C.
Thomas, 1974, pp. 304-306.



2. Maurice Merleau-Ponty, Signs (trans. with an Introduction by
Richard C. McCleary), Evanston, Ill.: Northwestern University
Press, 1964, pp. 101-102.



3. Jean-Paul Sartre, Anti-Semite and Jew (trans. by George J.
Becker), New York: Schocken Books, 1948, pp. 59-60.



4. Henry K. Beecher, "Medical Research and the Individual," in 
Life or Death by Edward Shils and others (Introduction by Daniel 
H. Labby), Seattle: University of Washington Press, 1968, p. 124 
(note: Beecher attributes this observation to R. A. McCance).



5. Jay Katz makes an important point in this connection: "Dis- 
tinctions have traditionally been drawn between research con- 
ducted by investigators on 'normal volunteers' in purely exper- 
imental settings and by therapist-investigators on 'patients' 
in treatment settings. It has generally been assumed that more 
stringent controls should be placed on investigators whose 
actions are designed to gain knowledge rather than to promote 
the subject's 'best interests.' Yet in most situations it is
difficult to draw lines between 'normal volunteers,' 'patient-
subjects,' and 'patients.' Moreover, the therapeutic setting
may be the one which deserves the closer scrutiny. While a 
volunteering subject can be alert to protect his own self- 
interest, a patient-subject's need for treatment may cause him 
to overrate the benefits and underestimate the risks of a 
research technique." ( Experimentation with Human Beings , New 
York: Russell Sage Foundation, 1972, p. 727).



6. Maurice B. Visscher, Ethical Constraints and Imperatives in 
Medical Research, Springfield, Ill.: Charles C. Thomas, 1975,
p. 64. 



7. Herrman L. Blumgart, "The Medical Framework for Viewing the
Problem of Human Experimentation," Daedalus , Vol. 98, No. 2, 
Spring 1969, p. 253. 




8. Otto E. Guttentag, "Ethical Problems in Human Experimentation,"
in Ethical Issues in Medicine (ed. by E. Fuller Torrey), Boston:
Little, Brown, 1968, p. 212.



9. Dr. Guttentag is sensitive to the therapeutic imbalance which
may result from the effort to protect the needs of the patient 
as well as the experimental subject. He writes ( ibid . , pp. 200- 
201) : "With reference to the relationship between experimenter 
and experimental subject, it is the concept of partnership be- 
tween the two, resulting from the fact of their being fellow 
human beings, that reflects our basic belief and cannot be sub- 
ordinated to any other." Cf. Joseph Fletcher, Morals and Medicine,
Boston: Beacon Press, 1960, p. 37.



10. Chauncey D. Leake, "After-Dinner Address: Ethical Theories
and Human Experimentation," Annals of the New York Academy of 
Sciences , Vol. 169, Art. 2, January 21, 1970, p. 394 (note: this 
is an issue on "New Dimensions in Legal and Ethical Concepts for
Human Research"). Cf. the following statement by Henry K. Beecher:
"...in discussing new and uncertain risk against probable benefit, 
Lord Adrian spoke of the rise in Britain of mass radiography of 
the chest. Four and a half million examinations were made in 
1957. It has been calculated that bone marrow effects of the 
radiation might possibly have added as many as 20 cases of 
leukemia in that year; yet the examinations revealed 18,000 
cases of pulmonary tuberculosis needing supervision, as well as 
thousands of other abnormalities. The 20 deaths from leukemia 
were only a remote possibility, but, Lord Adrian asks, if they 
were a certainty would they have been too high a price to pay
for the early detection of tuberculosis in 18,000 people?" (in
Updating Life and Death (ed. by Donald R. Cutler), Boston:
Beacon Press, 1969, pp. 239-240).



11. Paul A. Freund, "Ethical Problems in Human Experimentation," 
in Readings on Ethical and Social Issues in Biomedicine (ed. by
Richard W. Wertz), Englewood Cliffs, N.J.: Prentice-Hall, 1973,
p. 38. 



12. Hans Jonas, "Philosophical Reflections on Experimenting with 
Human Subjects," Daedalus , Vol. 98, No. 2, Spring 1969, pp. 228- 
229.



13. Ibid ., pp. 230-231. 






14. Renee C. Fox, Experiment Perilous, Glencoe, Ill.: Free Press,
1959, p. 46 (note: The continuation of the passage from which 
this quotation is taken deserves special attention (pp. 46-48)): 
"The following are the basic principles governing research on 
human subjects which the physicians of the Metabolic Group were
required to observe in order to 'conform to the ethics of the 
medical profession generally...and satisfy democratic morality,
ethics and law' : 

1. Voluntary consent of the subject is absolutely essential. 
Consent must be based on knowledge and understanding of the ele-
ments of the study and awareness of possible consequences. The 
duty of ascertaining the quality of consent rests on the individ- 
ual scientist and cannot be delegated. 

2. The experiment should seek some benefit to society, unob-
tainable by any other method. 

3. The experiment should be designed and based on prior ani- 
mal study, the natural history of the disease or problem and 
other data so that anticipated results may justify the action 
taken. 

4. It should be conducted to avoid unnecessary physical and 
mental suffering. 

5. No experiment should be undertaken where there is reason 
to believe that death or disability will occur, except perhaps 
where the experimenter may also serve as his own subject. 

6. The degree of risk should never exceed that which the 
importance of the problem warrants. 

7. There should be preparation and adequate facilities to 
protect the subject against even remote possibility of injury, 
disability or death. 

8. Only scientifically qualified persons, exercising a high 
degree of skill and care, should conduct experiments on human 
beings. 

9. The subject should be permitted to end the experiment 
whenever he reaches a mental or physical state in which its 
continuation seems to him impossible. 

10. The investigator must be prepared to end the experiment 
if he has reason to believe that its continuation is likely to 
result in injury, disability or death. 

The physicians of the Metabolic Group were deeply committed 
to these principles and conscientiously tried to live up to them 
in the research they carried out on patients. However, like most 






norms, the 'basic principles of human experimentation' are
formulated on such an abstract level that they only provide
general guides to actual behavior. Partly as a consequence,
the physicians of the Metabolic Group often found it difficult
to judge whether or not a particular experiment in which they
were engaged 'kept within bounds' delineated by these principles.

This was especially true of the experiments they conducted 
primarily to advance medical knowledge. The justification for 
this kind of research did not lie in its potential immediate
value for the patients who acted as subjects. Rather, it was 
premised on the more remote, general, uncertain probability that 
its 'anticipated results. . .their humanitarian importance. . .for 
the good of society' and the chance of achieving them — would 
exceed the immediate amount of 'suffering' and 'risk' the exper- 
iment might entail. The criteria on which physicians ought to
form such a calculus are not specified by the rules of conduct
for clinical research. Thus, without many established or 'clear-
cut' bases of judgment to guide them, the physicians of the
Metabolic Group were constantly faced with the problem of trying
to decide whether the particular experiments they were conducting
fell within the limits of their rights as investigators, or
whether they were overstepping those rights by subjecting the
patients involved to more inconvenience and danger than the
possible significance of those experiments for the 'advancement
of health, science, and human welfare' seemed to warrant."

15. Henry K. Beecher, "Medical Research and the Individual," p. 124. 



16. Henry K. Beecher says: "Again and again I think we are
deceiving ourselves if we think we can very often get satis- 
factorily informed consent. It's the goal toward which we strive, 
and in striving for it we get a positive value. The positive 
value is that the subject knows, because of your inquiry, that 
he is going to be the subject of an experiment. I can tell you 
hundreds of examples where they haven't known that they were
subjects sometimes of deadly experiments, and so I think there 
is a value in striving toward this goal. But we are deceiving 
ourselves if we think we ever achieve it in ordinary circum-
stances, in any reasonably complex situation." (In Ethical Issues
in Biology and Medicine (ed. by Preston Williams), Cambridge,
Mass.: Schenkman, 1973, p. 225.)






17. See the material on informed consent included in Jay Katz's
Experimentation with Human Beings. In Experiment Perilous,
Renee C. Fox provides the following information about consent
(p. 112): "Sometimes the Metabolic Group obtained the informal, 
spoken consent of the patients who participated in their exper- 
iments. However, for those which involved a considerable amount
of hazard and risk, they usually had the patients involved 
(or their closest of kin) fill out the following form: 

I, ____________________, hereby certify that I
have had explained to me the details of the
contemplated procedure and assume full responsi-
bility for any results of such a procedure.

Signed ____________________

Date ____________________

Witnessed ____________________



Putting the patient in possession of technical information 
not only protects his welfare; it also fulfills the moral pre- 
scriptions of science, and, in so doing, helps to perpetuate 
and give momentum to scientific investigation as an institution. 
There is evidence to indicate that when these moral precepts are 
violated, scientific creativity is impaired." 



18. Henry K. Beecher, "Consent in Clinical Experimentation—
Myth and Reality," in Experimentation with Human Beings, p. 583.



19. See Henry K. Beecher, "Medical Research and the Individual,"
p. 150. 



20. Robert J. Levine is right in hesitating to recommend the use
of mathematical models in determining risk-benefit relationships.
In addition to the difficulty he points out in assigning a weight
or probability to the experience of pain, there is the question
of how a mathematical model, once established, can interpret or
help us to interpret the concrete situation of the patient or
subject in a world defined not only by the experimenter but, in
the first and last instance, by the patient himself. See Levine's
manuscript of October 27, 1975 on "The Role of Risk-Benefit
Criteria in the Determination of the Appropriateness of Research
Involving Human Subjects" (prepared for the National Commission
for the Protection of Human Subjects of Biomedical and Behavioral
Research).




21. Quoted by Richard C. Allen in Readings in Law and Psychiatry
(ed. by Richard C. Allen, Elyce Zenoff Ferster, and Jesse G. Rubin),
Baltimore: The Johns Hopkins Press, 1968, p. ix.



22. Herrman L. Blumgart, "The Medical Framework for Viewing the
Problem of Human Experimentation," p. 272. 



23. Guido Calabresi, "Reflections on Medical Experimentation in
Humans," Daedalus, Vol. 98, No. 2, Spring 1969, pp. 388-389.



24. Ibid., pp. 391-392.



25. James Agee and Walker Evans, Let Us Now Praise Famous Men,
Boston: Houghton Mifflin, 1941, p. 56.

26. Renee C. Fox, Experiment Perilous, p. 44.









22 

ESSAY ON SOME PROBLEMS OF RISK-BENEFIT 
ANALYSIS IN CLINICAL PHARMACOLOGY 

Lawrence C. Raisz, M.D. 



Risk-Benefit Analysis is an extension of common sense decision 
making. Faced with any alternative, a rational individual will determine 
what the advantages and disadvantages of each particular course might 
be and then proceed. When such an analysis is extended from single 
individuals to physicians as investigators and patients as experimental 
subjects, and particularly when the analysis involves matters of life or
death, health or well-being, and legal sanction or disapproval, the analysis
becomes more difficult and common sense will not suffice.
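The common-sense weighing just described can be made explicit as an expected-value comparison. The sketch below is purely illustrative: the two courses of action, their probabilities, and their utilities are all invented for the example, and the qualitative difficulties discussed later in this paper show why such numbers alone cannot settle the question.

```python
# Illustrative expected-value comparison of two courses of action.
# All probabilities and utilities here are invented for the sketch;
# real risk-benefit analysis must also weigh qualitative factors
# that no single number captures.

def expected_utility(outcomes):
    """outcomes: list of (probability, utility) pairs; probabilities sum to 1."""
    return sum(p * u for p, u in outcomes)

# Course A: established drug -- modest benefit, well-known small risk.
course_a = [(0.70, 1.0),   # symptom relief
            (0.29, 0.0),   # no effect
            (0.01, -5.0)]  # known adverse reaction

# Course B: new agent -- possibly larger benefit, less certain risk.
course_b = [(0.60, 2.0),   # greater relief
            (0.37, 0.0),   # no effect
            (0.03, -5.0)]  # adverse reaction, rate poorly known

print(round(expected_utility(course_a), 2))  # 0.65
print(round(expected_utility(course_b), 2))  # 1.05
```

A rational individual would, on these invented figures, prefer course B; the rest of the paper concerns the many situations in which the figures themselves cannot be known or the riskers and beneficiaries are not the same people.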

This paper will focus on the particular case of the development and 
evaluation of drugs to be used in the treatment of human disease. The 
following special problems must be considered: 

1) In the first phase of drug development in man, the individuals who 
take the risks, that is, who are given the drug, are not those who will 
benefit from its subsequent use. In phase I clinical trials, normal 
subjects are given a drug to examine its pharmacokinetics and look for any 
unexpected adverse effects which have not been detected in animal testing. 

2) In phase II, when the drug is first administered to patients, these 
patients will be selected on the possibility that the drug is effective 

in their disease. However, there is no statistical basis on which to guess 
the likelihood that the effect will be desirable. 

3) While we assume that risks and benefits should be assessed in terms of 
weighing statistical probabilities, there are always numerically 
undefinable qualitative differences which must be taken into account. 






For example, a much lower per cent likelihood of a fatal reaction is 
acceptable for an agent used to treat a non-fatal illness compared with 
a drug used to treat a fatal illness. In our analysis we must assess what 
per cent of skin rashes should be the equivalent of what per cent of 
episodes of blood dyscrasia. This problem is compounded by the fact 
that the numbers of subjects are usually so small that the statistical 
inferences can only be made within very broad confidence limits. This is 
particularly true for infrequent but serious adverse drug reactions. 
Historically such reactions have never been fully appreciated until an
agent has been marketed and used in large numbers of individuals for some
years.
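How broad those confidence limits are for an infrequent reaction can be made concrete with the exact binomial upper bound (the familiar "rule of three" is its approximation). The calculation below is an illustrative sketch, not part of the original discussion.

```python
# Upper 95% confidence bound on the true adverse-reaction rate when
# no reactions at all are observed among n trial subjects.  Solving
# (1 - p)**n = alpha for p gives the exact bound; the "rule of three"
# approximates it as 3/n.  Illustrative sketch, not from the essay.
def upper_bound_zero_events(n, alpha=0.05):
    return 1.0 - alpha ** (1.0 / n)

# A 30-subject trial with no observed reactions still cannot rule out
# a reaction striking roughly 1 patient in 10 ...
print(round(upper_bound_zero_events(30), 3))    # 0.095
# ... while excluding a 1-in-1000 reaction takes thousands of subjects.
print(round(upper_bound_zero_events(3000), 5))  # 0.001
```

This is why a serious reaction occurring in, say, 1 of every 10,000 users can remain invisible through all premarketing trials and surface only after years of general use.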

4) Finally, risk-benefit analysis should include assessment of the risks 
attendant upon failure to develop a new agent or procedure, and the loss 
of benefits due to delays in developing an agent or impediments in making 
it available for general use. In terms of national and world health, this
is certainly, numerically, the most important kind of failure of any health
system to bring its maximum benefits to the greatest number of individuals.

In Section I of this paper, I will discuss the first problem in some 
detail. Section II will touch on other aspects more briefly. My point of 
view will be that of a clinical scientist; the concepts and examples 
will be derived from the diagnosis and treatment of "organic" illnesses,
that is, excluding psychologic research and therapy of psychiatric illness






in which I have had no personal experience. 

I. Is it valid to apply risk-benefit analysis when the riskers and the
beneficiaries represent totally different populations?

A simple answer to this question might be, "No, but we have to do 
something like it, so let's get on with the job." In a phase I clinical 
trial there must be prior toxicity studies in animals, sufficient to 
predict an extremely small risk of permanent physical injury, particularly 
of death, at the doses initially administered in man. Nevertheless, these 
risks can never be reduced to zero. The question at issue is to what extent 
and with what safeguards will human volunteers be allowed to take such risks 
for the benefit of others. While differing enormously in practice, in prin- 
ciple, religious sacrifice to appease the gods had the same motivations 
and was intended to serve the same needs of society. Societies which 
performed sacrifices believed that they would receive important benefits 
in the form of more rain, better crops, or victory in war. We expect to 
receive benefits in terms of better health by having humans take experi- 
mental risks. One major difference is that the scientific method should 
enable us to determine, after the fact, whether the expected benefits
were actually obtained and whether the risks in our use of human volun- 
teers were actually small. A second difference is that in primitive 
societies suffering was considered necessary for sacrifice to be effective 
while the goal in human research is the avoidance or mitigation of human 
suffering. 






However, one point which is emphasized by recognizing the common 
conceptual ancestry is that "not all volunteers are really volunteers." 
The Mayans sacrificed prisoners of war; we ask prisoners of society to 
volunteer for phase one trials. This does not make the use of prison 
volunteers in phase one trials indefensible. We consider ourselves to
be an essentially moral society; so, I suspect, did the Mayans. The
difference lies in the value placed on human life and freedom. Hence, 
to satisfy our moral tenets, we must believe that prisoner volunteers 
are true volunteers. If a prisoner can obtain decent food and housing, 
proper treatment from custodians, or consideration for early parole 
only by volunteering then there is coercion. If the only difference 
between a prison volunteer and a non-volunteer is a small compensation 
for the time and discomfort involved then the system may be truly 
voluntary. The gap between morally defensible and morally indefensible 
may seem large but as with all such polarities there are many gradations 
in between and these can change with time. This is apparent when considering 
some of the abuses in human experimentation which led to our present
concern. Experiments involving the injection of cancer cells and delays
in antisyphilitic therapy seem morally indefensible now, but were
presumably considered defensible by those who carried out or approved 
them. If one examines the records of hospital human investigation 
committees one can find evidence of changing criteria - recently these 
have been largely in the direction of greater concern for the safety 
and freedom from coercion of experimental subjects. The fact that 






prisoners eagerly volunteer to be experimental subjects does not resolve
the moral issue. In fact it may indicate how strong the element of coercion
is, that is the degree to which becoming a subject for a phase one
trial is advantageous to the prisoner and not volunteering is disadvantageous.
If so much benefit accrues that it would be better to take a substantial
risk of physical harm than not to volunteer, there is something wrong
with the penal system.

A special case in which the individual taking the risk is less likely 
to receive any benefit is the clinical trial in which a placebo or dummy 
treatment is used. It could be argued that this is no longer an important 
issue in clinical pharmacology since drugs which have been shown to be 
better than a placebo are now available to treat a wide variety of sub- 
jective symptoms. Hence, any new agent should be compared with the best 
agent previously available for that symptom or disease; and thus both the 
treated and control groups would be likely to benefit. Except for trials
which are designed to assess minor, quasi-therapeutic effects, such as a 
study to determine whether caffeine really helps students stay awake 
while studying, or trials on agents which may have small or subtle effects 
on mood or behavior, the use of placebos is becoming less necessary and 
less justifiable in therapeutic research. Probably the most important 
current need for placebo controlled trials is in the areas where they are 
unlikely to be undertaken because of practical difficulties or societal 
condemnation. For example, it might be worthwhile to repeat, using 






modern techniques, a study done years ago in which a group of patients 
were subjected to surgery intended to increase coronary perfusion and a 
control group actually underwent a dummy operation. The current solution 
is to have the control group treated medically, not subjected to a placebo 
operative procedure. While this is more easily defensible on moral and 
practical grounds, it may well be that the effectiveness of expensive
coronary bypass surgery will be experimentally validated not because of 
its cardio-vascular effects, but because an operation has an extremely 
powerful placebo effect. 

The principles for obtaining volunteers from other closed populations 
such as students, military personnel or patients in chronic care facilities, 
should be similar to those for prison volunteers. For students it is 
particularly important to separate the roles of teacher and evaluator from 
that of investigator so that students will not feel constrained to 
volunteer to get better grades or recommendations from faculty members. 
Clearly the best method for obtaining volunteers would be by recruiting 
from society at large, using appropriate advertisement. Even if volunteers 
are truly free, there remains the additional problem of determining whether 
some should be prevented from volunteering "for their own good". This 
involves both philosophic questions of the limits of individual freedom 
and psychiatric questions of the evaluation of mental competence. Society 
often errs on the side of excessively restricting individual freedom to
volunteer and undervaluing the mental competence of its members.






If consent is truly informed and risks are minimized, then it seems 
inappropriate to deny the right to volunteer because of what is judged 
to be an inappropriate personality or insufficient mental competence. The 
critical judgement should be whether informed consent is based on sufficient 
information which is sufficiently comprehended. Theoretically it is possible 
to inform individuals who are mentally ill, are below the age of legal consent,
or have relatively low intelligence or little education, provided that the
means are appropriate. This is simply an extension of the general problem; 
to obtain informed consent one has to provide information in terms that 
can be understood by the individuals asked to give consent. If there is 
no communication there can be no informed consent. One cannot obtain 
consent from fetuses or infants or patients in coma. 

There remains the most difficult question: whether anyone can
decide that non-consenting human subjects should be used for an experi- 
mental procedure. I believe that proper mechanisms for such experimentation 
must be developed because important advances in the prevention and treatment 
of human disease sometimes cannot be achieved by any other means. However, 
the usual mechanisms of review by the institution coupled with informed 
consent by the parent or guardian are not sufficient. A judicial state 
or federal review procedure is required to determine whether the benefits 
are sufficiently large, the risks sufficiently small and most important, 
whether there is no alternative method of obtaining the desired information. 






In a phase I trial on volunteers who are not expected to benefit from 
the agent being examined, the usual criteria for informed consent may not 
really be relevant. Neither the probability nor the nature of adverse effects 
is truly known. Information can be given based on animal trials but its 
uncertainty must be emphasized. On the other hand, it does seem appropriate 
to tell the volunteers in a study what the expected benefits to the other 
members of society might be. In other words, any volunteer should have the 
privilege of knowing why they are being asked to take a risk, and be 
treated with the dignity and respect that one should accord an active 
participant in the research process. If the prospective volunteer doesn't 
think a risk is worth taking for the benefit being sought, this should be
sufficient reason for them to refuse to participate. 

Even when volunteers are free to give or withhold properly informed 
consent, a different procedure may be required to assess the risk-benefit 
equation simply because the riskers and beneficiaries are different 
individuals. It may help to carry out the initial assessment of risks and 
benefits separately before looking at them together for comparative 
weighing. There are several reasons for this. First, the assessment of 
risk may involve different forms of expertise and certainly involves 
different societal considerations than the assessment of benefit. Second, 
the risk-benefit equation cannot be balanced internally by a single 
institutional review committee. The risks will be taken in one institution 
but the benefits will accrue outside it. Of course, in any experiment there 
is potential benefit to individuals and society outside the purview of the 






institutional review group, but some potential beneficiaries will be in 
the institution and the review group will have some understanding of these 
problems or access to local experts who do. Where the benefits are external,
separate expert consultants and advocates are needed to assess the potential 
benefit of the research. 

In assessing the risks for a volunteer group in a phase one trial one 
needs information from animal studies and an analysis of potential risks 
based on the experience of clinical pharmacologists whose special area of 
competence is adverse drug reactions. In addition, the volunteers need an 
advocate both to assure their general rights and to ascertain that there 
is no coercion. In assessing the benefits there should be input not only 
from those who are sponsoring the drug, but also from disinterested experts
in the therapy of the disease or condition for which the drug is intended,
who can testify as to the degree of need for additional or new therapy and
the likelihood that the therapy to be tested will fill that need. An 
additional advocate who represents the patient population at risk should 
have input. The final review must assess the material on risk and benefits 
coming from different sources and attempt a balance. This should be carried 
out by a group which is not only broad in composition, but includes indivi- 
duals who are independent of the institution where the initial research 
is carried out. Appropriate mechanisms could be developed at the community, 
state or federal level. The level used might depend on the nature and mag-
nitude of the project. Ideally multiple levels should be available for






appeal. At present review is carried out at the federal level by the 
Food and Drug Administration. The mechanisms are over-centralized and
sometimes cumbersome, and community and societal interests, particularly
those of potential beneficiaries, may not be fully appreciated. It seems
inappropriate to ask the FDA to add to its already heavy administrative 
load such an extensive consideration of the ethical, moral and social 
issues which are so often involved in clinical trials. The formation of 
a separate national review body might be a logical extension of the work 
of the National Commission. 

The tripartite approach discussed above may sometimes also apply to 
risk-benefit analysis in studies of non-therapeutic procedures. While
not ordinarily considered a part of clinical pharmacology, such studies are 
an important part of clinical research. Generally a diagnostic procedure, 
although experimental, is intended to be of benefit to the patient upon 
whom it is performed. However in the development and evaluation of a new 
diagnostic test, values on a series of control subjects are generally needed.
Where only blood and urine samples are obtained, this does not present 
great problems; the normal volunteers undergo essentially no risk and only 
the minimal discomfort of a venipuncture. The control material for tests 
involving biopsies is ordinarily obtained from autopsy material; however,
there is considerable current interest in utilizing tissue and organ culture 
to examine biopsy specimens functionally. To evaluate functional diagnosis 
in disease properly it is essential that similar material be obtained from 






unaffected individuals. Hence volunteers may be asked to undergo skin, bone, 
intestinal and liver biopsies. In addition there are many diagnostic proce- 
dures which involve the injection or ingestion of drugs or dyes which can 
produce occasional adverse reactions. Since the risks in these two instances
are quite substantial and those taking the risk will not benefit
medically, the complex tripartite evaluation scheme recommended for phase 
one trials ought to be applied. Unfortunately this direct and suitably 
monitored approach has often been circumvented by obtaining "control" 
data from those patients subjected to a particular procedure who do not 
turn out to have the disease in question. Such an approach leads to the 
temptation, perhaps unconscious, to test for a diagnostic possibility in 
a patient in whom the possibility is highly unlikely, simply to obtain 
additional data on a particular test or procedure. In this case risk- 
benefit analysis is applied in the more usual way discussed in part II 
of this paper, but in fact those asked to take the test are really not 
potential beneficiaries if the test is irrelevant. The best way to 
avoid this misapplication of a diagnostic test is to insist that the 
risk-benefit analysis be applied by the tripartite method. 

Finally I would like to mention a disparity between riskers and 
beneficiaries which the National Commission may not consider as part of 
its charge, but which could reflect on our national morality. We are 
increasingly dependent on other countries for the development and 
evaluation of new drugs. We congratulate ourselves on avoiding the use 






of thalidomide, but we could only know the risk because others took it. 
Clearly we should not take risks simply because investigation and review 
bodies in other countries are willing to do so. However we must also be 
careful not to use this willingness for our own benefit. On a recent visit 
to Africa I was concerned that foreign pharmaceutical firms might be using 
African patient populations to test new drugs with less regard for 
safety than they would have had in using their own nationals as subjects. 
To apply rigid criteria at home and tacitly approve less safe trials 
abroad is not morally defensible. 

II. Risk-benefit analysis when the risks and the benefits are likely 
to accrue to the same individuals or groups.

This problem can be divided into two parts: 
A) those circumstances in which the risks or the benefits are small, but
of sufficient substance to make an analysis worth considering, and
B) those circumstances in which both the risks and benefits are large. The
latter applies to the development and evaluation of therapy for serious 
illnesses for which current treatment is not adequate. The circumstance 
in which the risks are large and the potential benefits small is obviously 
one to be avoided. However, as exemplified by the thalidomide disaster and
the experience with chloramphenicol, the existence of excessive risks
may not be appreciated until extensive trials have been conducted. 
Investigators and review groups must be alert to this possibility so that no 
further studies will be conducted once this disparity between risk and 






benefit is known to exist. The situation in which the risks are small and
the benefits large is simply a desirable extension of the second category. 

A. Many trials in which both risks and benefits are small involve dispari- 
ties between those who take the risks and those who will benefit, but the
approach may be different from that considered in part I. Research in 
clinical pharmacology often involves the evaluation of agents which are 
expected to bring definite but limited benefits to the subject and to 
other patients with similar disorders. Such agents may turn out not to 
be beneficial to many of the patients treated initially. For example, in 
the evaluation of a new analgesic designed to replace aspirin in patients 
who cannot tolerate aspirin, the new drug might be used in patients who 
can tolerate aspirin and therefore are best treated with the older established 
drug. Such use can be justified because the risk is small and transient, 
and the benefit to others appreciable. Similarly in the reevaluation of 
currently available drugs or particular uses of those drugs which are 
of questionable merit, the expectation may be that there will be little 
benefit to the patients in the trial. There would be a benefit to future 
patients and to society if it could be clearly shown that a particular 
use of that particular agent should be discontinued. Risk-benefit analysis 
in this situation usually does not present insurmountable difficulties and
does not require the complex tripartite evaluation discussed in Part I. 
The risks are usually well known for already established agents. If the 






agents are to be used for relief of minor symptoms, low risk must be 
demonstrated and the benefits are usually such that both physicians and 
non-physicians can appreciate and evaluate them. 

One serious problem in risk-benefit analysis for drugs of this type 
is in dealing with what might be termed the information-use gap. Information 
derived from a trial is rarely used optimally for several reasons: 1) It 
may be difficult or inappropriate to apply the information of the trial to 
the larger population at risk. Consider the television advertisement for 
a drug taken predominantly for headache, in which the huckster points 
out that in studies on "pain other than headache, doctors at a teaching 
hospital and major medical center found agent X to be superior". 2) Indi- 
vidual clinical trials can be assessed by appropriate statistical means 
and careful descriptions of the patient population can be presented, but 
only after the accumulation of a number of such trials and the analysis 
of many relevant patient and disease factors can one arrive at a consensus 
concerning the therapy of larger populations. The validity of this con- 
sensus cannot be tested by ordinary statistical means. It is a matter of 
weighing evidence which seems more judicial than scientific. Hence we find
physicians speaking of their clinical judgement, and we are faced with
the difficult problem of deciding which physician's judgement to accept.
Perhaps some of the difficulty in this area would be resolved if those 
who are asked to weigh the evidence were trained not only as physicians and 
scientists but as lawyers and judges. The groups which make such decisions,
be they hospital pharmacy committees, state or federal purchasing agents






or the National Research Council Advisory Boards to the FDA might profit 
from more input by those familiar with judicial procedure. Scientific 
conclusions based on reproducibility, statistical validity and quality 
of the experimental design could be enhanced by the judicial assessment in 
the traditional terms of competence, relevance and materiality. 3) While 
such an approach might help us make a better assessment of therapeutic 
questions, it will not ensure that new judgements, whatever their quality,
are distributed appropriately. The availability of a careful assessment 
is not sufficient to close the information-use gap. After the risks and 
benefits of a particular therapy have been analyzed these must be presented 
so as to be understood by those who will use the therapy. The proportion 
of adverse reactions to a given drug which occur due to misinformation, 
misunderstanding or misuse by the physician or patient is generally 
much greater than the proportion of adverse reactions which occur 
because of unavoidable side effects during correct use of that drug. 
There is no absolute way of ensuring that the appropriate instructions will 
be carried out by physicians, patients or society. We have few groups 
which attempt to monitor the use and distribution of therapeutic agents 
and the findings of such groups may have little effect on the general use 
of an agent. This is a particularly severe problem in a capitalist system 
where profit has a powerful impact on the development and distribution of 
drugs. In the past beneficial drugs have not been marketed, because they 
were not profitable. The problem is compounded by the fact that physicians 






use drugs in a highly independent manner. They regard, in some cases 
correctly, the advice and instructions in package inserts and other 
informational material as excessively and inappropriately restrictive. 
Thus the information-use gap may also occur because the official infor- 
mation has not kept pace with non-official information which nevertheless 
influences current use. 

B. In considering the problem of assessing risks and benefits when both 
are large we need to take a fresh look at the relationship between patient 
and healer. Traditionally, the patient with a serious illness for which 
definitive therapy is not available is advised to seek out an outstanding 
physician (usually defined as one in whom others have much confidence), put
themselves in the hands of that physician, and do what they are told. This
demonstration of faith is the fundamental tenet of the primitive healing 
arts, and remains the principle by which quackery, folk medicine and a wide 
variety of dubious cures still gain acceptance. On the other hand good 
healers, dedicated to their patients' welfare and well-versed in scientific
medicine also make extensive use of "faith in the physician" to carry out 
their therapy. Is it appropriate to ask a patient to accept this relation- 
ship and at the same time ask them to take part in an experiment? In this
setting it seems more appropriate to engage the patient as fully as 
possible as a partner in a scientific enterprise. To do this effectively 
may require a change in the attitude of society towards therapeutic 
research. Today sick patients are generally ill-prepared to take an active 

role in decision making. I do not believe this is because sickness robs 
them of their judgement or because sick patients are intrinsically inca- 
pable of taking part in a decision concerning their own welfare. Rather 
it is because the tradition of faith in the physician is currently so 
powerful and pervasive. How common it is, after a long explanation of a 
patient consent form, to hear the patient say: "I'll do whatever you think 
best, doc." We must realize that the reason for this response may be 
that patients think that physicians expect it and are afraid to voice their 
underlying concerns. A substantial amount of education of both physicians 
and patients would be required to change this response. Nevertheless I 
believe that such education is necessary if we are to pursue clinical 
investigation actively in an era when new and powerful agents are continuously 
being made available, and require rapid induction. 

Two additional problems arise when the risks are large and the 
potential benefits are great. One is the problem of whether, even with 
informed consent, individuals can be asked to take a substantial risk on 
the possibility that their health will improve. We allow individuals to 
take much greater risks for financial gain. How can a society which permits 
and sometimes even encourages death-defying stunts prevent a sick individual 
from taking a substantial risk in the hope of gaining health, or even stop 
a heroic martyr from taking a substantial risk in the hope of achieving 
better health for others? Fear of legal reprisals as a result of the mal- 
practice explosion may have a powerful but inappropriate influence on 

risk-benefit analysis in this situation. Better methods must be devised 
for dealing with the malpractice issue in clinical research. 

III. Recommendations 

Much of what follows has already been suggested in the discussion 
above. My recommendations for better procedures for risk-benefit analysis 
have been generated from experience in academic medicine in a hospital 
setting, as a clinical investigator and as an active participant on both 
sides of the institutional review procedure. 

1. My major recommendation is that new procedures be developed for risk- 
benefit analysis of studies in which those taking the risks are different 
from those who benefit. As described above, I believe that there should 
be a tripartite review system for such studies. One group would have the 
appropriate expertise to analyze the risks and judge the propriety of the 
selection of volunteers to be certain that there is no element of coercion. 
The second group would consider the potential benefits and provide a disin- 
terested evaluation of the likelihood that such benefits will eventually 
accrue. These two groups should then present their findings for actual 
risk-benefit analysis to a third group. This third group has the most 
difficult task. They must weigh the personal risks taken by the volunteers 
against societal and personal benefits for others. It is clear that this 
group has a quasi-judicial function and should have the benefit of indi- 
viduals trained in judicial and legal procedure. In carrying out this 
review, the risks of not doing the study should be carefully presented 
and considered. 

The review system should have an appeal procedure embodied in it. It 
is possible that this could occur in several steps beginning at the local 
or institutional level and carrying through to the State and Federal 
levels. However it is implicit in a tripartite review system that the 
adjudicating group should not represent the institution at which the 
experiment on volunteers is to be conducted, but should have larger 
community representation. 

2. For that large proportion of human investigation in which the patients 
asked to undertake a risk are also likely to benefit, because the therapy 
under study is designed for their disease, the present system of institu- 
tional review appears to be quite adequate. Such institutional review groups 
in hospitals, medical schools, and research institutes should have guide- 
lines to help them determine whether in a particular instance risks and 
benefits are so separate that the more complex tripartite procedure might 
be appropriate. This problem can be identified easily when a specific 
phase I study of a new drug is being carried out in such an institution. 
The evaluation of a laboratory test in normals might come under further 
scrutiny, but only in those instances where there is some substantial 
risk involved. In the present composition of institutional review groups 
the regulation that representatives of the legal profession, the clergy and 
lay persons be included seems reasonable. At the moment it does not seem 
necessary to require that there be specific research advocates, that is 
individuals who will take it as their duty to point out the usefulness of 



research, and the risks of not doing research. It may be that as review 
becomes more stringent and regulations become complex this function of 
institutional review will also have to be specified. 

3. Probably the most important and difficult problem is that of improving 
the dissemination and application of therapeutic information and closing 
the information-use gap. No single approach will solve this problem. 
A large number of changes ranging from reorganization of the distribution 
of medical care to improvement in public relations and the development of 
better instruments for informing physicians of new developments in thera- 
peutics must be considered. The important first step is to recognize formally 
that this gap represents a major defect in our health system. Efforts to 
close it must be supported at all levels--local, State and Federal--from 
both the public and private sectors, using a wide variety of techniques. 



INFORMED CONSENT 



23 

NATURE AND DEFINITION OF INFORMED CONSENT 
IN RESEARCH INVOLVING DECEPTION 

Diana Baumrind, Ph.D. 
January 28, 1976 



Preface 

My charge from the Commission is to discuss the nature and definition 
of informed consent in research involving deception. The discussion will 
not present a balanced and impartial view of all sides of this admittedly 
complex issue. Rather, I shall speak as a social scientist and to those 
issues which affect social scientists. Only in passing shall I be concerned 
with related ethical problems as they apply to biological and medical research. 

For more than 20 years, I have been actively engaged in the practice of 
behavioral science research, and for more than half that time with the ethical 
issues which are raised whenever one does research with human subjects. I am 
known to hold a non-permissive position regarding the use of deception, and 
I shall speak as an advocate of that position. This I feel free to do in the 
expectation that Dr. Berkowitz, who has been asked to prepare a paper on the 
same subject, will argue in defense of research employing deception. Taken 
together, our two approaches should provide--at the least--one basis for a 
much needed dialogue. 

By comparison with journal writing my style will be leisurely and to 
some extent repetitive. I shall risk redundancy for the sake of clarity and 
assume that the repetition of the same argument in different contexts is a 
necessary--if sometimes tiresome--corrective to possible misunderstandings. 
I shall use the male pronoun to stand for the human person because I find its 
avoidance in a philosophical paper too cumbersome. 
Definition of Problem 

Nature of Deception 

Deception can be classified as nonintentional or intentional. Noninten- 
tional deception, which includes absence of full disclosure, failure to inform 
and misunderstanding, cannot be entirely avoided. Full disclosure of everything 
that might affect a given subject's decision to participate is a worthy ideal 

but not a real possibility. For example, in the case of young children and 
partially disabled adults the investigator must content himself with absence 
of dissent and with assent rather than consent. While the youngest child can 
communicate unwillingness to participate (dissent) and a somewhat older child 
can indicate willingness to participate but without full understanding of 
what will be required (assent), only the mature, reflective adult is truly 
capable of fully informed consent. All secondary analyses of data and some 
acceptable studies of public behavior commit "failure to inform," another 
form of nonintentional deception. And finally, since perfect communication 
is impossible to achieve there is probably always some degree of misunder- 
standing in the contract between researcher and subject. However regrettable, 
such misunderstanding is inevitable and as such is not a proper subject for 
this essay. 

My concern in this paper is primarily with intentional deception. This 
includes the withholding of information to obtain participation, concealment 
in natural settings, manipulation in field experimentation, and deceptive 
instructions and manipulations in laboratory research. 

The function of deception in social psychological experimentation is to 
construct relevant experimental controls by means of fictional environments. 
Fictional environments are designed to induce specific sets or expectancies in 
subjects by the creation of false social norms, by the use of misleading verbal 
instructions or by the presence of nonfunctional visual props including elec- 
trical and electronic gear (Seeman, 1969). The presumed function of conceal- 
ment and withheld information is to cancel the effect of the observer on the 
phenomena being observed in the interest of objectivity--a goal that physicists 
have long since rejected on theoretical grounds (the Heisenberg principle). 

Incidence of Use of Deception 

The use of deception continues to be the rule rather than the exception 
in social psychological research today. No professional organization abso- 
lutely prohibits deceptive practices of the kind it associates with good 
research. The very thoughtful code of the American Anthropological Associ- 
ation (attached) (1973), does not prohibit unobtrusive surveillance; the 
extremely perfunctory code of the American Sociological Association (attached) 
(1968), contains no prohibitions at all nor does it require informed consent; 
and the extensive revised code of the American Psychological Association (1973), 
while advising against deceptive experimental practices, condones deception in 
all cases where the presumed benefit exceeds the presumed cost. 

Several surveys document the use of intentional deception in social psycho- 
logical research. Stricker (1967) surveyed the four major social-psychological 
journals published in 1964 (Journal of Abnormal and Social Psychology (JASP), 
Journal of Personality (JP), Journal of Social Psychology (JSP), and Socio- 
metry). He found that some areas of research use deceptive strategies almost 
to the exclusion of nondeceptive strategies. Thus 81% of conformity studies 
and 72% of cognitive dissonance and balance studies involved deception, while 
such strategies rarely occurred in learning and attitude studies. Seeman (1969) 
analyzed the total published literature in the JP and the JASP from 1948-1963 
for use of deceptive strategies. The mean figures combined for 1948 and 1963 
are 18.47% and 38.17%, respectively. According to Menges (1973), also surveying 
JP and JASP, the percentage of studies reporting use of deception was 16% in 
1961 and 38% in 1971. In 1973 the American Psychological Association (APA) 
revised its code of ethics, giving careful consideration to the issues of 
informed consent and deceit in laboratory and field settings. If the revised 
code effectively reduced the incidence of deceptive practices we might expect 
to see a drop in 
the incidence of published reports. I therefore examined the September 1974 
issue of Journal of Personality and Social Psychology (JPSP) , the official 
journal of the APA in the areas of personality and social psychology (which 
now replaces both JP and JSP), to see if a drop had indeed occurred. Of the 
15 empirical studies reported, six used deceptive instructions in an intention- 
al attempt to manipulate the subjects' set or to create false social norms. 
Thirteen months later (October 1975) I examined JPSP again for incidence of 
deception. The Table of Contents is included as Table 1. There were 20 
empirical reports among the 22 papers. Of these, 13 employed deceit. Of the 
13 that employed deceit, three were trivial instances (numbers 8, 12, and 17) 
in which (in my judgment) no harm, including loss of trust, could ensue either 
from the procedures themselves or from the disclosure of deceit in the de- 
briefing. In number 8, subjects were told that their discussions were being 
videotaped when they were not; in number 12 that they would be "overcrowded" 
when they were not; and in number 17 that the lists of digits presented to 
them followed a certain order when in fact the order was random. 

Ten studies employed nontrivial deceit which in my view involved clear 
violations of the ethical principles of the APA and/or could result in real 
psychological harm to the subjects. Of these 10, six made no mention of de- 
briefing. Subjects, with two exceptions, were introductory psychology students 
or freshmen. Most of these studies dealt with such socially important themes 
as altruism or conformity and thus could be justified by the usual cost/benefit 
rationale. One of these ten (number 3) used deceptive instructions with 7-10 
year old children to measure altruism; subjects were exposed to adult models 
behaving either altruistically or selfishly and were told that their winnings 
(preset, not genuine, scores) could be donated to poor children. Were debrief- 
ing used (none was mentioned) the children who had behaved selfishly would have 

Table 1 

JOURNAL OF PERSONALITY AND SOCIAL PSYCHOLOGY 

VOLUME 32, NUMBER 4    OCTOBER 1975 

1  Environmental Noise Level as a Determinant of Helping Behavior/Kenneth E. Mathews, Jr., and Lance Kirkpatrick Canon  571 

2  Does the Good Samaritan Parable Increase Helping? A Comment on Darley and Batson's No-Effect Conclusion/Anthony G. Greenwald  578 

3  Saying and Doing: Effects on Observer Performance/Marnie E. Rice and Joan E. Grusec  584 

4  Implicational Principles and the Cognition of Confirmatory, Contradictory, Incomplete and Irrelevant Information/Gordon Bear and Alexandra Hodun  594 

5  A Closer Examination of Causal Inference: The Roles of Consensus, Distinctiveness, and Consistency Information/Bruce R. Orvis, John D. Cunningham, and Harold H. Kelley  605 

6  Effects of Personality-Situation Locus of Control Congruence/Thomas K. Srull and Stuart A. Karabenick  617 

7  Skill Versus Luck: Field and Laboratory Studies of Male and Female Preferences/Kay Deaux, Leonard White, and Elizabeth Farris  629 

8  Persons, Situations, and the Control of Social Behavior/Mark Snyder and Thomas C. Monson  637 

9  An Experimental Study of Crowding: Effects of Room Size, Intrusion, and Goal Blocking on Nonverbal Behavior, Self-Disclosure, and Self-Reported Stress/Eric Sundstrom  645 

10  Differential Effects of Jury Size on Verdicts Following Deliberation as a Function of the Apparent Guilt of a Defendant/Angelo C. Valenti and Leslie L. Downing  655 

11  Attraction and Expectations of Harm and Benefits/Barry R. Schlenker, Robert C. Brown, Jr., and James T. Tedeschi  664 

12  Waiting for a Crowd: The Behavioral and Perceptual Effects of Anticipated Crowding/Andrew Baum and Carl I. Greenberg  671 

13  Effects of Noncontingent Reinforcement on Tasks of Differing Importance: Facilitation and Learned Helplessness/Susan Roth and Larry Kubal  680 

14  Visual Versus Verbal Information in Impression Formation/Shigeru Hagiwara  692 

15  Frequency of Reciprocated Concessions in Bargaining/S. S. Komorita and James K. Esser  699 

16  The Mediation of Aggressive Behavior: Arousal Level Versus Anger and Cognitive Labeling/Vladimir J. Konecni  706 

17  Need Achievement and Risk-Taking Preference: A Clarification/John C. Touhey and Wayne J. Villemez  713 

18  Sex Differences in Moral Internalization and Values/Martin L. Hoffman  720 

19  Psychological Differentiation as a Factor in Conflict Resolution/Philip K. Oltman, Donald R. Goodenough, Herman A. Witkin, Norbert Freedman, and Florence Friedman  730 

20  Children's Use of the Multiple Sufficient Cause Schema in Social Perception/Michael C. Smith  737 

21  The Relationship Between Attitudes and Beliefs: Comments on Smith and Clark's Classification of Belief Type and Predictive Value/Kerry Thomas  748 

22  When Self-Interest and Altruism Conflict/Robert J. Wolosin, Steven J. Sherman, and Clifford R. Mynatt  752 

List of Manuscripts Accepted  757 



suffered shame and guilt. At best, all subjects were left with false notions 
about their own performance and that of adult models. In two studies (numbers 
11 and 23), experimenters delivered mild electric shocks as well as false 
instructions to undergraduate students with no debriefing by one (number 11). 
Another study (number 6) encouraged college students to cheat by using false 
instructions; no debriefing was mentioned. Data from two studies (numbers 1 
and 2) investigating helping behavior were obtained by staging incidents 
with "victims" supposedly in need of assistance. In the 10 studies where 
nontrivial deceptive practices were used, informed consent was entirely pre- 
cluded. 

Nature and Definition of Informed Consent 

Under the APA code of ethics and the present HEW guidelines, informed 
consent means the consent of a person (or his or her legally authorized 
representative) so situated as to be able to exercise free power of choice. 
Free power of choice, in turn, implies that choice be made on full and accurate 
information, including an accurate explanation of the procedures to be followed 
and a description of any attendant discomforts or risks reasonably to be 
expected. As usually stated, there are six basic elements of informed consent. 
In order to distinguish my use of the term informed consent from the usual 
literal interpretation, I will follow each element with a comment. 

1. A fair and understandable explanation of the nature of the activity, 
its purpose, and the procedures to be followed, including identi- 
fication of any procedures which are experimental. 

The investigator should not be required to disclose to the subject the 
purpose of the experiment. The requirement that in effect the investigator 
share his hypotheses with subjects would invalidate most social science research. 
Obviously subjects' behavior will be affected by explicit knowledge of the 
investigator's hypotheses. It is deceitful for the investigator to misinform 
the subject as to the purpose of the experiment, but not to explicitly with- 
hold information. It is sufficient to indicate to the subject that such 
information cannot be shared during the initial briefing but will (or will 
not in some cases) be disclosed at the debriefing. Subjects should also be 
informed of the possibility that there will be secondary analyses of data. 
Some potential subjects may refuse to participate on the grounds that they 
will not be permitted to censor future use of the data. That is their right. 
However, having been informed and having consented, subjects should not be 
given the right to veto the investigator's use of his findings in primary 
or possible secondary analyses. 

2. An understandable description of any attendant discomforts and 
risks reasonably to be expected. 

3. An understandable description of any benefits reasonably to 
be expected. 

Written statements of possible risks and benefits sound to subjects like 
threats and promises and are, I think, counterproductive. Where there is a 
possibility of attendant discomforts and risks, these should be discussed 
with the subject in a briefing interview so that wherever possible procedures 
can be accommodated to the subject's needs, or his inappropriate anxieties 
dispelled. While the investigator may promise specific rewards such as feed- 
back information, referral or money, he can seldom determine in what ways the 
experience will be intrinsically beneficial or rewarding, although he can 
express his hopes that it will be. 

4. An understandable disclosure of any appropriate alternative 
procedures that might be advantageous for the subject. 

In behavioral research this alternative is not really open to the sub- 
ject. Generally he must be assigned to a particular experimental or control 
group. It is essential, however, that the subject consent to the procedures 
to which he will be subjected. 

5. An offer to answer any inquiries concerning the procedures. 

The heart of informed consent is the right of the subject to be in- 
formed as to the actual nature of the experience which he is to undergo. 
It is to these procedures that the subject is consenting or withholding con- 
sent. Incomplete or inaccurate information here is tantamount to intentional 
deception. 

6. An understanding that the person is free to withdraw his or her 
consent and to discontinue participation in the activity or the 
project at any time prior to its termination without prejudice 
to the subject. 

Provided that fully informed consent has been obtained, the investigator 
should retain the right to encourage the subject to continue unless it becomes 
clear that the subject is being more than mildly inconvenienced. Certainly 
the experimenter should retain the right to withhold payment proportionate 
to the loss of the subject's services. Respect for the subject dictates that 
he has responsibilities as well as rights. While not constituting a legal con- 
tractual obligation on the part of the subject to continue or complete his 
service, acceptance of a prior fee morally obligates him to fulfill his part 
of the agreement. This obligation, of course, presumes that the subject has 
given his informed consent. And since the responsibility for assuring that 
consent is based on adequate information rests with the experimenter, any 
evidence that the subject did not anticipate the actual effects upon him suf- 
fices to relieve him of his contractual obligations--and this without finan- 
cial or psychological penalty. But mere inconvenience should not relieve 
the subject of his moral obligation to continue and the experimenter should be 
able to exert tactful pressure towards that end, including withholding of pay- 
ment for services not rendered. 
Cost/Benefit Approach to Justification of the Use of Deception 

Judging by their behavior, social scientists who use deceitful practices 

do not regard such practices as immoral. Yet these same scientists would 
not condone the normative use of deceit in everyday personal relations. In 
the practice of their profession, however, these scientists use deceitful 
practices openly, publish their procedures without apology and indeed with 
prideful exhibition of ingenuity (e.g., Milgram, 1963), teach their students 
to copy their example and reward them when they do, and vigorously defend 
their procedures when attacked. Their justification is contained in the cost/ 
benefit principle. The experience of being deceived or not fully informed is 
not in itself viewed as a cost. Provided the study's objective is of scien- 
tific or social interest and the methodology adequate, the cost/benefit prin- 
ciple can be, and is, in most instances, invoked to justify the use of deceit. 

I will argue that the cost/benefit approach as generally applied serves 
to justify rather than inhibit the use of deceitful practices and misinformed 
consent. Moreover the costs to the subject and society are underestimated 
and the benefits to society are overestimated. 

Inadequacy of Cost/Benefit Approach 

As a basic principle of adjudication the cost/benefit justification of 
deceptive practices is inadequate. The cost/benefit justification of deceptive 
practices cannot be reconciled with Personalism or any other form of universalist 
metaethics, or with rule-utilitarianism provided that what is perceived as con- 
stituting the greatest general good prohibits the justification of lying and 
deceit. In accord with the deontological or universalist position the basic 
judgments of obligation are presented as being given intuitively without recourse 
to consideration of what serves the common good. For deontologists such as 
Kant, the principle of justice or of truth or the value of life stands by itself 
without regard to any balance of good over evil for self, society, or the uni- 
verse. For nontheistic deontologists, morality is, I suppose, equated with 
aesthetics, requiring of the moral individual a fine sensibility and intuition. 

This is the view of Aristotle, when he states that the decision as to what 
determines the golden mean rests with perception. For theistic deontologists 
humankind is the bearer of "an alien dignity" rooted in the value God places 
on us. Personalism, or the idea that the life and integrity of the person 
remain of greater value than any object or function which the person may be 
called upon to serve, is central to both the Buddhist and Christian tradition. 
Nontheistic deontologists who agree with Wallwork (1975, p. 75) that "persons 
are of unconditional value" and that it is "the right of every person to an 
equal consideration of his claims in every situation, not just those codified 
into law or professional rules", must reject the cost/benefit analysis because 
it wrongly (from their perspective) subordinates basic human rights to benefits 
of whatever kind or value. According to rule-utilitarianism, an act is right 
if and only if the principle under which it falls is thought to produce at 
least as great a balance of good over evil as any available alternative. Un- 
like universalist principles, rule-utilitarian rules are culturally and situ- 
ationally relative (a good thing, in my opinion). If deception is perceived 
as a fundamental principle governing an act then deception itself would have to 
be viewed as promoting the greatest general good. However, no ethical system 
does in fact condone lying and deception as a principle of action, although not 
all lies or deceptions are regarded as blameworthy, and many "white" lies are 
regarded as praiseworthy. 

Telling the truth and keeping promises are regarded as obligatory in most 
systems of ethics for many compelling reasons. Perhaps the most compelling of 
all is the belief that the coherence of the universe cannot be maintained with- 
out contract. Contracts and promises provide the same security in the social 
world which invariant cause-and-effect relations provide in the physical world. 
Without invariant cause-and-effect relations in the physical universe, 

goal-oriented behavior would be impossible. Imagine a situation in which 
turning a doorknob could release a stream of lemonade or trigger a gun or 
any number of other possibilities, as well as open a door. Only by acting 
in accord with agreed-upon rules, keeping promises, and avoiding deceit can 
human beings construct for themselves a coherent, consistent environment in 
which purposive behavior becomes possible. Thus, the long-range good that 
truth-telling promotes facilitates self-determination or authority over one's 
own person. 

Rule-utilitarianism (to which I subscribe) unlike universalism does not 
pretend to establish the absolute validity of the ends sought. It accepts the 
possibility that deceptive research practices (or killing for that matter) can, 
under certain circumstances, be justified. The circumstances under which such 
justification is possible are those in which the rule requiring informed con- 
sent (or the not taking of human life) may be given a lower priority than the 
rule establishing freedom of scientific inquiry (or the rule prohibiting 
murder). In other words, if the values of science--to know and report--take 
precedence over the values that dictate concern for the person--integrity, reci- 
procity and justice--then should those two sets of values come into direct con- 
flict, the values of science could be justified as an ethical basis for action. 
The crux of the issue, of course, has to do with establishing a hierarchy of 
values. This may be done by demonstrating that one rule or value (in a given 
culture at a given time) rather than another better facilitates the Good Life of 
one's own culture, humankind, or all sentient beings, depending on one's ulti- 
mate beneficiary. According to this view, if one believes (as I do) that values 
which dictate concern for the person take precedence over the values of science 
(in that factually the human values are more facilitative of the Good Life than the 
scientific ones), then a cost/benefit justification of deceitful practices 

is proscribed. 

By contrast with a rule-utilitarian, an act-utilitarian must calculate 
the costs and benefits of every situation without recourse to the guidance 
of overriding rules or principles, an approach which leads to unavoidable 
and unresolvable difficulties. Act-utilitarianism, for example, would require 
that in each instance the individual calculate anew whether or not to obey the 
laws against running a red light or stealing for personal gain. This concrete 
approach to ethical judgment occurs in the individual at an early period of 
development and is usually superseded by appeal to rule and principle as 
soon as the individual is capable of abstract thought. Act-utilitarianism 
would seem to restrict the moral sense to a rather primitive level. Moreover, 
act-utilitarianism is presumptuous. The actor presumes that he possesses in- 
sight superior to that of the distilled wisdom contained in the principle he 
disregards. Should a witness lie in a court of law to save a defendant he 
is sure is innocent? Joseph Fletcher, the Situation Ethicist, answers: "Yes, 
he should lie if he believes the defendant would otherwise be found guilty." 
(1966). The deontologist answers: "No, a lie is always wrong." The rule- 
utilitarian answers: "No. Provided that the court system functions justly, 
the common good is best served by truth-telling." The responsibility of the 
witness is to present his evidence convincingly. If he truly knows, why 
should his evidence not convince the court? To justify his willingness to 
lie, the witness would have to uphold the right of any witness to lie, pro- 
vided that the witness felt sure in his own mind of the guilt or the innocence 
of the defendant. The principle which proscribes lying under oath is intended 
to preserve the common good by determining truth through consensual judgment 
rather than in accord with the strong conviction of any one man. Act-utili- 
tarianism is tied to the present. Consider, for example, the guarantees of 

the rights of the accused or the minority in the Bill of Rights. The exercise 
of these guarantees often creates a situation where the protection of the 
rights of an individual will violate the common good, e.g., the exercise of 
free speech to support racism. Act-utilitarians would have to reject the Bill 
of Rights in that situation, whereas rule-utilitarians would inquire as to 
whether the common good were benefited by universal adherence to the principle 
of free speech. The rule-utilitarian would evaluate the effect on the common 
good of violating that principle rather than apply cost/benefit analysis to 
this particular instance or act (verbal defense of racism). A rule-utilitari- 
an would argue that if an objection to the content of a statement were used to 
justify a violation of free speech in this instance, then any objection to con- 
tent could be used to restrict the right of a citizen to speak out, e.g., the 
right of a pacifist to speak out against the Vietnam War. Unlike the act- 
utilitarian, the rule-utilitarian would find the guarantees contained in the 
Bill of Rights consistent with his moral philosophy because these principles, 
if generalized, would benefit the common good. Act-utilitarianism is en- 
tirely pragmatic. Matters of conscience exercise different capacities and 
appeal to different motives in humankind than matters of practical judgment. 
Suppose that act A and act B result in exactly the same ratio of cost to 
benefit, but act A involves deceit and breaking a contract, while act B invol- 
ves purchasing a cocktail dress rather than a pair of badly needed walking shoes. 
A consistent act-utilitarian would view acts A and B as both equally wrong if 
they both produced an identical score on the minus side. But from the deonto- 
logical viewpoint, or that of rule-utilitarianism, act A must be regarded as 
more unethical than act B, otherwise there is no ethical question to be 
decided, only a practical one. Most present codes of ethics, including the APA 
code and the HEW regulations, are written from an act-utilitarian metaethics. 
From either a universalist or rule-utilitarian position, the codes and their 
metaethical justifications are inadequate. 

In practice, present codes do not regulate the activities of scientists so 
that they conform with generally held standards of ethical behavior; any rule 
can be violated merely by proclaiming that the benefits to humanity or the 
reduction of inferential ambiguity justify the costs to subjects. The argument 
for violations of subjects' rights on the basis of a cost/benefit analysis is 
well presented in the revised code of ethics of the APA.

The obligation to advance the understanding of significant aspects of 
human experience and behavior is especially likely to impinge upon well- 
recognized human rights. Significant research is likely to deal with 
variables and methods that touch upon sensitive human concerns. And if 
ambiguity in causal inference is to be reduced to a minimum-- an essential 
of good science — research must be designed in ways that, on occasion, may 
make the relationship between the psychologist and the human research 
participant fall short of commonly held ideals for human relationships. . . 
(1973, p. 8)

According to the APA code of ethics, when a conflict between scientific rigor 
and the rights of subjects arises, the experimenter's ethical obligations to 
the subjects may be superseded. To be specific, the following rights of the 
subject are recognized explicitly in the APA code but may be suspended in the 
interests of scientific rigor: 

a. The right of the subject to be involved in research only with his 
knowledge and informed consent (Principles 3 and 5). 

b. The right of the subject to be dealt with in an open and honest 
manner (Principles 4 and 8). 

c. The right of the subject to protection from physical and mental 
distress and loss of self-esteem (Principle 7). 

d. The right of the subject to a clear and fair contractual agree- 
ment (Principle 6). 

Referring to the cost/benefit approach by which such violations are jus- 
tified, the Code states: 

Almost any psychological research with humans entails some choice as to 
the relative weights to be given to ethical ideals, some choice of one 
particular ethical consideration over others. For this reason, there are 
those who would call a halt to the whole endeavor, or who would erect 
barriers that would exclude research on many central psychological questions. 
But for psychologists, the decision not to do research is in itself a 
matter of ethical concern since it is one of their obligations to use 
their research skills to extend knowledge for the sake of ultimate 
human betterment (1973, p. 7). 

In making this judgment, the investigator needs to take account of 
the potential benefits likely to flow from the research in conjunction 
with the possible costs, including those to the research participants, 
that the research procedures entail. ... An analysis following this 
approach asks about any procedure, "Is it worth it, considering what 
is required of the research participant and other social costs, on the 
one hand, and the importance of the research, on the other?" Or, "do 
the net gains of doing the research outweigh the net gains of not doing 
it?" The decision may rule against doing the research, or it may affirm 
the investigator's positive obligation to proceed. Such an analysis is 
also useful in making choices between alternative ways of doing research. 
For example, "Are the costs to research participants greater or less 
if they are informed or not informed about certain aspects of the re- 
search in advance?" "What will be the effect of these two alternatives 
on potential gains from the research?" (1973, p. 11) 

The revised Code assumes moral dilemmas are inevitable in the research 
endeavor; but the function of a system of moral philosophy is precisely to 
avoid such dilemmas. In point of fact, the use of a cost/benefit analysis 
serves to legitimate the loophole known as the "moral dilemma," that is, the 
situation in which the actor believes that he is forced to choose between 
equally culpable alternatives. But it is a person's duty insofar as possible 
to avoid provoking situations which create conflicts of obligation, since 
such conflicts by definition result in harm to some. Act-utilitarianism 
presented as a cost/benefit analysis readily lends itself to the "moral 
dilemma" loophole, whereas rule-teleology or rule-deontology do not. 

Application of Cost/Benefit Approach 

If a cost/benefit approach is adopted, then the costs and benefits must 
both accrue to the subject. In medicine the cost/benefit analysis is a 
weighing of the likely benefits to the patient of a proposed plan of treat- 
ment versus the probable costs (risks) to the patient — financial, physical 
and emotional --of that form of treatment. Thus a physician may present a 
woman who has a diagnosis of breast cancer with alternative treatments for 
her consideration, including chemotherapy, lumpectomy and radical mastectomy. 
It is questionable whether the physician has the right to determine for the 
patient the balance of risk over benefit of such alternative treatment plans. 
It is certain, however, that the physician is not morally privileged to pass 
judgment concerning the balance of risk to the patient versus the benefit to 
humankind by using the patient as a medical guinea pig to test these alter- 
native procedures. If she is to risk her personal welfare for the benefit of 
humankind the patient must, without qualification, have access to all available 
information concerning the effects of treatment in order that she, not the 
physician, may make that decision knowledgeably. The investigator who with- 
holds information in effect imposes his perspective upon the subject about 
what is good for humankind, thereby repudiating the subject's right to his or 
her own 
informed perspective. 

Most experimentation with human subjects places them "at risk" in the 
sense that they are treated as passive things to be acted upon, means toward 
an end they cannot fully understand. An individual may choose to incur some 
degree of risk, inconvenience, or pain by becoming an experimental subject. 
To accept these risks knowingly for the sake of others may be an act of 
charitable concern or an expression of commitment to the community. By 
agreeing to be a subject, a person to some extent relinquishes his sovereign 
will. When the subject accepts the research objectives and freely becomes a 
participant, he is rewarded by self-affirmation and social approval, much as 
is the scientist-participant. By serving an ideal such as progress, knowledge 
or human welfare, the subject and researcher accrue merit and a justified 
sense of self-enhancement. But a subject whose consent is obtained by deceit- 
ful and fraudulent means cannot recover his sovereign will. He remains instead 
a passive and obedient object for the experimenter to manipulate and is thus 
diminished rather than enhanced by his participation. Common law protects 
individual freedom by proscribing manipulation of the psychological self. 
Possible benefits to mankind cannot justify legally (or morally) any exception 
to the requirement of full and frank disclosure to each person of all facts, 
probabilities and beliefs which a reasonable person might expect to consider 
before giving his or her consent. (See Mishkin's citation, 1975, p. 2, of 
Halushka vs. University of Saskatchewan, 1965). 

Analysis of Costs of Deception 

The costs of deception have been greatly underestimated. These costs are 
ethical, psychological, scientific and societal. If harm is defined as death 
or permanent mental or physical disability, then, with rare exceptions, harm 
to the subject will not result from behavioral science research. The effects, 
harmful and beneficial, are more subtle in behavioral science research than 
in medical research. The costs and benefits to the subject and to society 
are in the realms of feelings, cognitions and values rather than in physical 
and material realms. 

I advocate the position that to intentionally deceive subjects or to 
obtain their consent fraudulently is to place them "at risk" even if they 
do not, as a result, experience additional stress or permanent harm. 

It is important to note that all the provisions of the DHEW and APA 
codes (including the right to informed consent) apply only after a subject 
has been determined to be at risk. Only after probable risk has been es- 
tablished must the investigator determine that "the risks to the subject are 
so outweighed by the sum of the benefit to the subject and the importance of 
the knowledge to be gained as to warrant a decision to allow the subject to 
accept these risks" (Federal Register, 1974). It seems evident to me that 
the substantive rights of subjects should be guarded by the DHEW regulations 
whether or not additional harm from the violation of these rights can be 
demonstrated. If there is objection to guaranteeing the rights of subjects 
in addition to their welfare, then I would argue that psychological invasion 
is itself injurious to the subject's welfare. I agree with Mishkin (1975, p. 2) 
that the law is moving "toward a model which protects against the invasion 
or manipulation of a person's psychological self." 

Inherent in this broadened perspective on legal liability in behavioral 
research is an increased sensitivity on the part of community leaders to the 
ethical problems raised in abusing a fiduciary relationship. To the extent 
that special privileges are accorded professionals and academics as an exten- 
sion of public confidence in their protective functions, a fiduciary relation- 
ship may be said to exist between this segment of the community and the rest. 
That is, the professional segment of the community in its relationship with 
the rest may be viewed as trustees of the values inherent in its activities — 
such values as integrity, compassion and trustworthiness. 

1. Ethical Costs of Deception . Any moral system which places preeminent 
value on humankind's reason and moral autonomy will allow few exceptions to the 
rule of informed consent. By moral autonomy is meant the right and obligation 
of each mature, healthy human being to assume personal responsibility for his 
actions. In accord with this view, the right of the subject to choose freely 
to participate in research is inviolable, not to be abridged by the investi- 
gator, although it may be waived by the subject. Doing research on people 
without their knowledge and informed consent is unethical under all circumstances. 
Principle 3 of the APA Code of Ethics reads: 

Ethical practice requires the investigator to inform the participant of 
all features of the research that reasonably might be expected to in- 
fluence willingness to participate, and to explain all other aspects 
of the research about which the participant inquires. (But then the 
qualification:) The decision to limit this freedom increases the 
investigator's responsibility to protect the participant's dignity 
and welfare (1973, p. 42). 

Contained within each of these principles concerned with informed consent is 
a qualification which permits the principle to be violated although it is 
explicitly stated that ethically acceptable research requires establishment 
of a clear and fair agreement between the investigator and the research par- 
ticipant and that the investigator is obliged to honor all promises and com- 
mitments included in that agreement. But a subject who has been deceived as 
to the nature of his agreement cannot enter into a clear and fair agreement 
in the first place. These qualifications are wrong not because the subject 
may be exposed to suffering, but because a subject deprived of the right to 
informed consent has thereby been deprived of his right to decide freely and 
rationally how he wishes to invest his time and person. He has 
also been unjustly tricked into thinking his consent was informed when it was 
not. If as a result of the experimental manipulations the subject has in 
addition been entrapped into revealing to himself and others undesirable char- 
acteristics such as destructive obedience, dishonesty or sadism, he has truly 
relinquished more than he bargained for. Fundamental moral principles of re- 
ciprocity and justice are violated when the behavioral scientist acts to de- 
ceive or diminish those whose extension of trust is based on the expectation 
that persons to whom trust is accorded will be trustworthy in return. 

The experimenter by his deceitful actions violates the implicit social 
contract which binds experimenter and subject in which the subject assumes 
that the experimenter is both knowledgeable and trustworthy and that his code 
of ethics does not contain a "buyer beware" clause. Neither does the subject 
assume that the accumulation of knowledge has priority in the experimenter's 
hierarchy of values over decent treatment of the subject-participant. In 
view of the special vulnerability, both personal and moral, which the subject 
invites by suspending disbelief and extending trust, the experimenter should 
agree to abide by a code of professional ethics more stringent, not less 
stringent, than his personal code. 

As Kelman (1967) notes, most of us in our interhuman relationships do not 
expose others to lies, deliberately mislead them about the purposes of an inter- 
action, make promises we intend to disregard or in other ways violate the re- 
spect to which all fellow humans are entitled. Yet we do so and feel justified 
in so doing in the experimenter-subject relationship. I have argued here that 
we ought to abide by a more, not less, stringent code of ethics in professional 
situations. Instead we justify our treatment of subjects solely as objects 
on the basis of our professional role. Thus we legitimize as well as commit 
ethical violations. The legitimization itself has harmful effects: it relieves 
the investigator of culpability and of the responsibility for devising non- 
deceitful alternatives or making reparation; it promotes false values that 
worthy ends such as the pursuit of truth justify unworthy means such as the 
use of deceit. 

2. Psychological Costs of Deception . Deceitful practices are most costly 
to the person when they have the following characteristics: 

a) The implicit or explicit contract between two persons is violated by 
one party (aggressor) without the consent of the other (victim) to the benefit 
of the aggressor and the detriment of the victim. 

b) The effect on the victim is to: (1) impair his ability to endow his 
activities and relationships with meaning, (2) reduce trust in legitimate 
authority, (3) raise questions about regularity in cause and effect relations, 
(4) reduce respect for a previously valued activity such as science, (5) neg- 
atively affect the individual's ability to trust his own judgment, or (6) 
impair the individual's sense of self-esteem and personal integrity. 

c) The aggressor is respected by the victim and therefore could serve 
as a model . 

The effects upon subjects which I judge to be most harmful are those 
which result in cynicism, anomie, and hopelessness. In my view, the most 
injurious consequence that can befall a person is to lose faith in the pos- 
sibility of constructing for himself a meaningful life. Any experience which 
diminishes that faith inflicts suffering and possible harm. College students, 
who are the most frequently used subject pool, are particularly susceptible 
to conditions that produce an experience of anomie. 

I want now to illustrate the way in which I believe deceit and manipulation 
place subjects at psychological risk. My former secretary, Paula Lozar, de- 
scribed an incident which illustrates the way in which deception in an exper- 
imental setting can contribute to a young person's feeling of anomie as loss 
of faith in the meaningfulness of life. 

When I was 18, a sophomore in college, a psychologist from a nearby 
clinic came to my dormitory one evening and explained that he was 
looking for subjects for an experiment which involved simply telling 
stories about pictures which would be shown them. This sounded inter- 
esting, so I signed up. At the interview the same psychologist intro- 
duced me to a girl a few years my senior, who stayed bland and noncom- 
mittal throughout the time she interviewed me. She showed me a few 
pictures, and since they were extremely uninteresting I felt that the 
stories I was making up must be very poor. But she stopped at that 
point and told me that I was doing very well. I was gratified and said 
something to that effect before we went on to the rest of the pictures. 
Then I filled out a form about my reactions to the interview, the ex- 
perimenter, etc., and she took it and left. After being alone for a 
few minutes, I looked around the office and noticed a list of the last 
names of subjects, with "favorable" and "unfavorable" written alter- 
nately after each one. Shortly thereafter the male psychologist 
returned and said that, as I had guessed, what the interviewer had said 
had nothing to do with my performance. They were testing the effects of 
praise and dispraise on creative production, and he said so far they had 
discovered that dispraise had negative effects and praise seemed to have 
none at all. Since I expressed interest, he promised that the subjects 
would be given full results when they were tabulated. (But we never heard 
from him.) 

My reaction to the experiment at the time was mixed. I assumed that 
the deception was necessary to get the proper reaction from me, and that 
since I had behaved unsuspiciously the results of the experiment were 
valid. However, I was embarrassed at having been manipulated into 
feeling pride at a non-achievement and gratification at praise I didn't 
deserve. . . Since in my early years in school I had alternated between 
being praised for doing well and being damned for doing too well, I had 
always been a poor judge of my own achievements and had no internal 
standards for evaluating my performance--although I knew I was very in- 
telligent and felt that some sort of moral flaw kept me from doing as 
well as I might. At the time, I was attending a second-rate college 
and felt (rightly) that my grades had nothing to do with how well I was 
really doing relative to my ability. This experiment confirmed my con- 
viction that standards were completely arbitrary. Furthermore, for 
several years I had followed a pattern of achievement in which I would 
go along for quite a while doing well in classes, interpersonal relations, 
etc. Then I would have a moment of hubris in which I was more self- 
confident or egotistical than it behooved me to be in that situation. At 
this point someone would cut me down to size; I would be totally devastated, 
and it would take me a long time to work myself up to my previous level 
of performance. The experiment had, in a lesser degree, the same effect 
upon me, and it . . . confirmed me in this pattern because the devastating 
blow was struck by a psychologist, whose competence to judge behavior I 
had never doubted before. . . It is not a matter of "belief" but of 
fact that I found the experience devastating. I told literally no one 
about it for eight years because of a vague feeling of shame over having 
let myself be tricked and duped. It was only when I realized that I 
was not peculiar but had, on the contrary, had a typical experience that 
I first recounted it publicly. . . 

At the time of the experiment, I had arrived at a position common 
to young adults who have lost confidence in external standards, either 
ideals or authorities, as a guide to how to live, and was in the process 
of formulating my own standards. As a result of my early lack of self- 
confidence and inconsistent school experiences, my task had been laborious 
and not entirely successful. . . The experiment confirmed me in my lack 
of success. I had been led into a situation where I was explicitly told 
to disregard my own interpretation of what was going on and made to per- 
ceive it another way, and then eventually told that both ways I had per- 
ceived it were wrong. . . The result was to further convince me that my 
perceptions were useless as a guide for action, and that, since the only 
person I felt I could trust--mysel f--was not trustworthy, I had no way of 
judging how to act and hence it was better not to act at all . . . 

I was harmed in an area of my thinking which was central to my per- 
sonal development at that time. To me, and to most of my classmates, the 
task of setting one's own standards, of formulating guides to living. . . 
was one of the most important tasks we faced. This had to do with . . . 
one's ability to give meaning to one's life. I rather suspect that many 
of us who volunteered for the experiment were hoping to learn something 
about ourselves that would help us to gauge our own strengths and weak- 
nesses, and formulate rules for living that took them into account. Some- 
thing of the sort was, I know, in the back of my own mind. When, instead, 
I learned that I did not have any trustworthy way of knowing myself — or 
anything else--and hence could have no confidence in any lifestyle I 
formed on the basis of my knowledge, I was not only disappointed, but felt 
that I had somehow been cheated into learning, not what I needed to learn, 
but something which stymied my very efforts to learn. 1 



1 Lozar, P. Personal communication, 1972. 

Ms. Lozar thus describes the serious effects she felt this deception had 
on her, and they are precisely the kinds of effects which I designated 
earlier as "most costly." Yet many investigators regard none of these 
effects as real, demonstrable or serious. Whose criteria concerning psy- 
chological costs are to be adopted? 

3. Scientific Costs of Deception . The scientific costs of deception 
in research are considerable. These costs include: a) exhausting the pool 
of naive subjects, and b) jeopardizing community support for the research 
enterprise. If these costs are real it will become increasingly difficult 
to do valid research; we may be damaging chances for others to work in the 
same locations or on the same problems. This harm may be irreversible. 

a) Exhausting the pool of naive subjects. In the experimental 
situation, the investigator must assume that subjects accept the reality of 
the situation as defined by the experimenter, or that if subjects fail to do 
so that the investigator knows this. But there is increasing reason to doubt 
that subjects are indeed naive. As a result of widespread use of deception, 
psychologists are suspected of being tricksters. Suspicious subjects may 
respond by role-playing the part they think the experimenter expects, doing 
what they think the experimenter wants them to do (Orne, 1962) or pretending 
to be naive. 

Wahl (1972) has summarized the growing body of evidence that deception 
in psychological research is not effective and subjects are not naive. Wahl 
documents his assertions that it is neither theoretically nor practically 
defensible to assure subject naivete by deception, and that experimental 
realism obtained through situational deception is not necessarily more suc- 
cessful than the realism of deception-free situations. Moreover, Wahl concludes 
from his review that experimenters cannot distinguish subjects for whom 
the deception promotes experimental realism from subjects who merely 
pretend to be fooled. If the widespread use of deceit has decreased the 
likelihood that subjects will be naive, as Wahl's survey suggests, such 
practices are obviously counterproductive. If the sample to be used in any 
given study is biased already (such as jaded lower-division psychology 
students), then the argument that informed consent may be dispensed with in 
order to assure an unbiased sample becomes unconvincing. It is an essential 
part of the teaching responsibility of professors to feed research findings 
back to the student subject pool through lectures and articles. This cum- 
ulative knowledge is then passed on to successive generations of students. 
Undergraduate psychology students, the most frequently sampled of all pop- 
ulations, are of necessity sophisticated. We must assume that their knowledge 
interacts with experimental conditions to produce results which may not be 
replicable in the general population. 

Any population subject to behavioral science research will be similarly 
affected. Thus Brody (196?) found that almost all members of a delinquent 
sample chose delayed reward under experimental conditions while a normal sample 
selected both delayed and immediate reward under the same conditions. The 
delinquent sample findings were contrary to predictions and led to further 
probings which revealed that similar research had been recently conducted in 
that institution so that the subjects were not naive. 

b) Jeopardizing community support for the research enterprise. The 
power of the scientific community is conferred by the larger community. Social 
support for behavioral science research may be jeopardized by investigators' 
encapsulation within parochial values if these values conflict with more 
universal principles of moral judgement and moral conduct. The very existence 
of the Commission suggests that the use of unethical research practices has 
jeopardized community support. Congress no longer appears to trust the 
professional associations to police themselves. We really do not know 
the public's attitude today towards the scientific enterprise. We should know 
it, so that these attitudes can be considered when formulating ethical codes, 
and so that investigators will be more aware of their responsibility to con- 
stituents and supporters, and thus to the community at large. 

4. Societal Costs of Deception . Social science research through its 
methods and substantive findings has widespread political and social effects. 

It can be argued that by its very nature social science research is 
a political act. In the research endeavor, certain participants (those in 
charge), are in power with control while other participants (the subjects), 
are defined as "objects" of assessment. Frequently these "objects" are exposed 
to the investigator's values in a highly coercive situation. For example, most 
investigators embrace an ideology of individualism which knowingly or unknow- 
ingly they impose on subjects. What the subject may have thought of as cooper- 
ation may be labelled destructive obedience as in the Milgram situation, or 
what he may have thought of as social cooperation is labelled as external 
locus of control (e.g., by Rotter). The use of fraud and deceit while the 
subject is in a heightened state of suggestibility, as he is when truly 
naive, should be thought of as increasing the risk that the subject will intern- 
alize at least temporarily the investigator's values, even if these are anti- 
thetical to his own. The investigator may be convinced of the rectitude of 
his values (as Milgram is) but does he have the right to impose his values on 
subjects (as Milgram does)? 

The scientific justification for using deception is to assure subject 
naivete. But it is the naive subject who is disproportionately placed "at 
risk" by the use of deceit and fraud because he risks disillusion and brain- 
washing. The sophisticated subject, already suspicious, is merely confirmed 
in his cynicism by deceitful practices. But to the extent that subjects 
are sophisticated, the experimental deception has failed to increase the 
scientific or social benefit of the experiment. Thus it would appear that 
deception is least justified ethically when it is most successful. 

If praxis in the laboratory or natural setting cannot be isolated from 
praxis in daily life, the implications are far-reaching. If subjects learn 
they cannot trust those who by social contract are designated trustworthy 
and whom they need to trust to avoid feeling alienated from society, then the 
damage done to the subjects and to society by the enacted values of researchers 
is very real . 

Subjects are given objective reasons to distrust authorities in whom they 
should have confidence, and apparently they are affected by this experience. 
For example: 

Fillenbaum (1966) found that deception led to increased suspiciousness 
(even though subjects tended not to act on their suspicions), and 
Keisner (1970) found that deceived and debriefed subjects were "less 
inclined to trust experimenters to tell the truth" (p. 7). Other 
authors (Silverman, Shulman, and Wiesenthal, 1970; Fine & Lindskold, 
1971) have noted that deception decreases compliance with demand charac- 
teristics and increases negativistic behavior. (James M. Wahl, 1972, 
p. 12) 

Ring, Wallston and Corey (1970), in their follow-up interview exploring 
subjective reactions to a Milgram-type obedience experiment reported that many 
subjects stated that they were experiencing difficulty in trusting adult authori- 
ties. In most of these studies mild and non-threatening deceptions were used, 
so that one speculates about the possible unknown lasting corruption of trust 
resulting from more severe deceptions. 

Truth-telling and promise-keeping serve the function in social relations 
that physical laws do in the natural world; these practices promote order and 
regularity in social relations, without which intentional actions would be very 
nearly impossible. By acting in accord with agreed-upon rules, keeping promises, 
acting honorably, following the rules of a game, human beings construct for 
themselves a coherent, consistent environment in which purposive behavior 
becomes possible. Animals, other than man, have limited capacity for manipu- 
lations and feints. Humankind may unnecessarily complicate its quest for 
survival by employing deceit and manipulation as an accepted part of a valued 
activity. 

I believe that it is good for people to place a value on the activities 
of behavioral scientists and on the values inherent in scientific activity. 
The disciplined exercise of intelligence in science or art is of value in 
itself, and this value does not depend upon the betterment of the material 
aspects of life to which it rightfully leads. If the rule which justifies 
scientific experimentation is "You shall know the truth and the truth shall 
set you free," then that rule applies also in the conduct of science. The use 
of the pursuit of truth to justify deceit risks the probable effect of 
undermining confidence in the scientific enterprise and in the credibility of 
those who engage in it. 

Analysis of Benefits of Deception 

There can be societal benefits to the use of deception only if there are 
probable scientific benefits associated with its use that are obtainable in 
no other way. The basic rationale for the experimental method is contained in 
the revised code of ethics of the APA as part of its justification of a cost/ 
benefit analysis. 

Not only do ethical questions follow from the psychologist's pursuit 
of important independent and dependent variables but the methods that 
are adequate to make inferences as unambiguous as possible tend to be 
the ones that raise ethical difficulties. Many psychologists believe 
(though some question this) that to obtain valid and generalizable data, 
it is often essential that the research participants be naive. The 
requirements of research may thus seem to demand that the participants be 
unaware of the fact that they are being studied or of the hypotheses 
under investigation. Or deception may appear to be necessary if a 
psychological reality is to be created under experimental conditions 
that permit valid inference (1973, pp. 8-9). 

Many scientists are calling into question the implications contained 
in the above statement (e.g., Chein [1972], Guttentag [1971], Harre and 
Secord [1972], Kelman [1966], and Orne [1962]). Schultz concludes in his 
critical examination of the history of human experimentation 

... that psychology's image of the human subject as a stimulus-response 
machine is inadequate and that many studies are based on data supplied 
by subjects who are neither randomly selected nor assigned, nor rep- 
resentative of the general population, nor naive, and who are suspicious 
and distrustful of psychological research and researchers (1969, p. 214). 

As a number of critics including Brandt, Guttentag, Mixon and myself 
have pointed out, the ecological validity of studies widely acclaimed for their 
scientific merit is so questionable as to raise serious objections concerning 
the benefit to society of generalizations based on these findings. A case in 
point is the Milgram study (1963): 

The following is Milgram's abstract of his experiment: 

This article describes a procedure for the study of destructive obedience 
in the laboratory. It consists of ordering a naive S to administer 
increasingly more severe punishment to a victim in the context of a learning 
experiment. Punishment is administered by means of a shock generator with 30 
graded switches ranging from Slight Shock to Danger: Severe Shock. The 
victim is a confederate of E. The primary dependent variable is the maxi- 
mum shock the S is willing to administer before he refuses to continue 
further. 26 Ss obeyed the experimental commands fully, and administered 
the highest shock on the generator. 14 Ss broke off the experiment at 
some point after the victim protested and refused to provide further answers. 
The procedure created extreme levels of nervous tension in some Ss. Profuse 
sweating, trembling, and stuttering were typical expressions of this emo- 
tional disturbance. One unexpected sign of tension--yet to be explained-- 
was the regular occurrence of nervous laughter, which in some Ss developed 
into uncontrollable seizures. The variety of interesting behavioral dynam- 
ics observed in the experiment, the reality of the situation for the S, 
and the possibility of parametric variation within the framework of the 
procedure, point to the fruitfulness of further study (p. 371). 

The fundamental question Milgram asks is "how does a man behave when he is 
told by a legitimate authority to act against a third individual?" (p. 851) 
Milgram generalizes his findings to apply to the actions of men in combat and 
guards in Nazi concentration camps. According to Milgram, "within the general 
framework of the psychological experiment obedience varied enormously from one 
condition to the next." (p. 851) Well, then, to what social conditions does 
the laboratory condition reported (1963) have generality? The experimenter's 
directive to dangerously shock the victim is, on the face of it, inappropriate 
in a psychological setting, and perhaps bizarre. A specialist in the science 
of psychology is expected to display compassion and personal integrity, so 
that such an order "to act harshly and inhumanely against another man" (p. 852) 
is incongruous. There is nothing incongruous about that order in a setting 
such as military combat. An officer and a psychologist are quite different 
kinds of authorities. The superior officer is an authority in the sense 
that he can require and receive submission and is authorized by the state 
to command obedience and is given the power to control and punish subordinates 
for disobedience. A psychologist relating to a subject or client has the 
authority of a specialist in a given field whose statements in that area can 
reasonably be considered authoritative. His area of legitimate authority 
rests not on power to punish, but upon trust extended by the subject or patient 
and based on the psychologist's claim to wisdom, knowledge, and professional 
integrity. Both the enlistee and the subject assume an integral aboveboard 
relationship not based on personal gain in the narrow sense. But the similarity 
cannot be pushed much further provided that normal conditions prevail. The 
military officer who orders enlisted men to fire upon the enemy engenders in 
their minds a very different kind of conflict, if any conflict at all, than 
the conflict engendered by the psychologist when pressing the subject to severely 
shock the victim. The officer's order to fire upon the enemy is patently ap- 
propriate to the situation. If the officer ordered the enlisted man to fire 
upon comrades further up front in order to prod them forward in the common cause, 
that condition might indeed be likened to the experimental condition. There 
are situations like that for which Milgram's condition is valid, but they are 
not a part of normal social life as he suggests. 




The dissonant demands made upon the subject in a laboratory setting 
might reasonably produce a sense of unreality and absurdity quite different 
from that experienced in any normal setting. While in a state of confusion 
brought about by this unique juxtaposition of cues, the subject is urged to 
act. Disobedience in this setting is as likely to reflect flight and indecision, 
or fight against authority, as a moral decision to refrain from hurtful action. 
Obedience is as likely to reflect a sense of fair play and employee loyalty 
as a lack of moral sense or weakness of character. 

Mixon (1974) repeated Milgram's experiment in an effort to understand 
the contexts in which subjects obey and disobey. 

. . . I found that when it became perfectly clear that the experimenter 
believed the "victim" was being seriously harmed all actors indicated 
defiance to experimental commands (Mixon, 1972). Briefly summarized, 
the All and None analysis suggests that people will obey seemingly in- 
humane experimental commands so long as there is no good reason to think 
experimental safeguards have broken down; people will defy seemingly in- 
humane experimental commands when it becomes clear that safeguards have 
broken down--when consequences may indeed be what they appear to be. 
When the experimental situation is confusing and mystifying as in Milgram's 
study, some people will obey and some defy experimental commands. (pp. 80-81) 

Another explanation for the behavior of Milgram's obedient subjects is offered 
by Brandt. 

Had Milgram considered himself as just another human being from whose 
behavior something can be learned about human behavior in general. . . he 
would have known that human beings can inflict suffering on other human 
beings, if they can rationalize their behavior. Self-examination could 
have told him so. (1971, p. 237) 

When "subjects" are viewed by the experimenter in the dictionary 
meaning of the word, the authoritarian relationship can lead the exper- 
imenter to consider their behavior as "obedience." The implicit assumption 
is then made that experimental psychologists differ from human experimental 
subjects to such an extent that similar overt behavior by the two groups 
cannot be assumed to result from similar covert causes (motivations, needs, 
drives, etc.). This distinction between experimenters and subjects is 
evidenced by explaining similar behavior of the two in terms of dis- 
similar motivations. In Milgram's experiments both experimenter and sub- 
jects inflict pain on others. This infliction of pain on others is explained 
by Milgram as "obedience" when done by the subjects and as "examining 
situations in which the end is unknown" (1964b, p. 848) when done by the 
experimenter. (1971, p. 239) 

In the Milgram experiment, the presence of the experimenter sanctioned 
aggressive behavior on the part of the subject, just as Milgram's authority 
sanctioned the aggressive behavior of the experimenter and stooge. 

Holland (1968), focusing his analysis on the deception manipulation, 
demonstrated with three experiments that a high percentage of Milgram's 
subjects probably detected the deception without Milgram's knowledge. Mixon 
(1972) argues that subjects may always be expected to suppose the existence 
of at least minimal precautions safeguarding the physical well-being of sub- 
jects, and that therefore the judgment that Milgram's obedient subjects 
behaved in a "shockingly immoral" fashion is quite gratuitous. The general- 
ization of Milgram's findings to real life conditions--people will comply 
with an imperative whose effects they believe harmful to another non-threatening 
individual--is not as self-evident as many social psychologists seem to think. 

I have questioned in some detail the scientific validity of Milgram's 
research because it is frequently cited as an example of using deception in 
which the scientific and social benefits are very great, even if they do not 
outweigh the costs to the subject. I have tried to show, however, that 
Milgram's procedures are not only ethically unjustifiable--whatever their 
presumed benefits--but also, from a strictly scientific point of view, incon- 
clusive. Far from studying real life in the laboratory as he thought he was, 
Milgram in fact may have constructed a set of conditions so internally in- 
consistent that they could not occur in real life. 

Many critics of the experimental method believe that laboratory studies 
typically preclude ecological validity. Thus Guttentag (1971) states: 

Although the classical model holds sway in psychology, there are 
a number of issues which continue to be raised about it and the logic 
of statistical inference with which it is associated. . . The indepen- 
dence of the subject and the experimenter is difficult to assume in 
much research. . . Another problem is the experimenter's assumption 
of an essential independence and neutrality of each subject unit; i.e., 
that human beings are interchangeable. . . Although the logic of 
experimentation and of statistical inference requires the assumption, 
one may still question whether it is a tenable one . . . 
. . . even when the individuals from such populations are randomly as- 
signed to experimental conditions; given that people live within social 
systems, there is no logical guarantee that some condition which affects 
all subjects uniformly, a condition unknown to the experimenter, is 
not interacting with the experimental variables to produce a particular 
set of findings (1971, pp. 80-81). 

The rigorous controls which characterize the laboratory setting may 
prevent generalizations to the free social environment. The extent to 
which one may generalize from behavior observed in the laboratory to the 
life situation is negatively related to the change which containment and 
control produce in that behavior. While the subject is familiar with the 
individuals, setting, and stimuli in his natural environment, he is unfamiliar 
with those in the laboratory setting. His reactions to the novel, or to the 
familiar in incongruous settings will affect his behavior. The power relations 
are qualitatively different in the experimental setting. There, the exper- 
imenter is the controlling party and the subject is an object of control. The 
two are in an authoritarian relationship in a setting unfamiliar to the subject. 

We know that ambiguity of causal inference is an inherent part of research 
in the social sciences. Yet we continue to act as if the perfect experiment 
is just around the corner and, but for our ethical scruples, we would readily 
reach that scientific millennium. 
Nature and Definition of Informed Consent in Field Research 

As skepticism has increased concerning the veridicality of subjects' 
behaviors in experimental studies, particularly in personality and social psych- 
ology, the use of naturalistic experimentation and naturalistic observation has 
grown. In naturalistic experimentation the investigator intervenes to affect 
the normal behavior of the person observed while in naturalistic observation 
he does not. In either instance participants may be unaware that they are 






participating in research at the time the data are collected. That research 
activity is occurring is concealed in a number of ways, including covert 
observation and recording of public behavior, obtaining information from 
third parties, disguised field experimentation and covert manipulations. 

Silverman (1975) provides us with the following synopsis of prototypic 
naturalistic experiments: 

1. Persons selected at random are phoned. The caller pretends that 
he has reached a wrong number, using his last piece of change, and that 
his car is disabled on a highway. The party is requested to phone the 
caller's garage and ask them to come for him. The garage number is 
actually the caller's phone and another experimenter, standing by, pretends 
to take the message (Gaertner & Bickman, 1972). 

2. Automobiles, parked on streets, look as if they were abandoned. 
(License plates are removed and hoods are raised.) Experimenters hide 
in nearby buildings and film people who have any contact with the cars 
(Zimbardo, 1969). 

3. People sitting alone on park benches are asked to be interviewed 
by an experimenter who gives the name of a fictitious survey research 
organization that he claims to represent. At the beginning of the 
interview, the experimenter asks a person sitting nearby, who is actually 
a confederate, if he wouldn't mind answering the questions at the same 
time. The confederate responds with opinions that are clearly opposite 
those of the subject and makes demeaning remarks about the subject's 
answers; for example, "that's ridiculous"; "that's just the sort of 
thing you'd expect to hear in this park" (Abelson & Miller, 1967). 

4. The experimenter comes to a home, says that he has misplaced the 
address of a friend who lives nearby, and asks to use the phone. If 
the party admits him, he pretends to make the call (Milgram, 1970). 

5. A female and a confederate experimenter visit shoe stores at 
times when there are more customers than salesmen. One of them is 
wearing a shoe with a broken heel. She rejects whatever the salesman 
shows her. The confederate, posing as a friend of the customer, sur- 
reptitiously takes notes on the salesman's behavior (Schaps, 1972). 

6. Housewives are phoned. The caller names a fictitious consumers' 
group that he claims to represent and interviews them about the soap 
products they use for a report in a "public service publication," which 
is also given a fictitious name. Several days later the experimenter 
calls again and asks if the housewives would allow five or six men into 
their homes to "enumerate and classify" all of their household products 
for another report in the same publication. If the party agrees, the 
caller says he is just collecting names of willing people at present 
and that she will be contacted if it is decided to use her in the sur- 
vey. No one is contacted again (Freedman & Fraser, 1966). 

7. A person walking with a cane pretends to collapse in a subway car. 
"Stage blood" trickles from his mouth. If someone approaches the victim, 
he allows the party to help him to his feet. If no one approaches before 
the train slows to a stop, another experimenter, posing as a passenger, 
pretends to do so and both leave the train (Piliavin & Piliavin, 1972). 






8. One experimenter takes a seat next to someone sitting alone in 
a subway car. Another experimenter approaches the person sitting next 
to the first experimenter and asks if the train is going downtown. The 
first experimenter intercedes before the party has a chance to answer 
and gives the wrong information. The second experimenter thanks him 
and takes a seat nearby (Allen, 1972). 

9. Letters, stamped and addressed to fictitious organizations at the 
same post office box number, are dropped in various locations, as if 
they were lost on the way to being mailed. Some are placed under auto- 
mobile windshield wipers with a penciled note saying "found near the 
car." (For one study with this procedure, the permission of the Post 
Office Department was obtained to use the names of fictitious organi- 
zations; Milgram, 1969.) 

10. Experimenters, walking singly or in pairs, ask politely for either 
10¢ or 20¢ from passersby, sometimes offering an explanation for why 
they need the money (Latane, 1970). (p. 765) 

Both Nash's comment on that paper (Nash, 1975) and Mishkin's later 
paper (1975) expound the concept of injury to include protection of the psycho- 
logical self. These papers point out that case law now includes deceit, inva- 
sion of privacy and violation of civil rights in the concept of liability. If 
the research activities summarized by Silverman violate one or more of these 
values, investigators may be considered to have abused a fiduciary relation- 
ship, ethically if not legally. 

In a recent popular presentation entitled Snoopology (1975), John Jung 
discusses some probable effects of experimentation in real-life situations 
with persons who do not know they are serving as experimental subjects. These 
include: increased self-consciousness in public places, broadening the aura of 
mistrust and suspicion that pervades daily life, inconveniencing and irritating 
persons by contrived situations, desensitizing individuals to the needs of others 
by "boy-who-cried-wolf" effects so that unusual public events are suspected of 
being part of a research project. 

At present a strong case can be made for the scientific value of field 
research using inobtrusive observation. But as the frequency of naturalistic 
experimentation increases, the usefulness of these procedures is bound to 
decrease. 

Referring to laboratory research, Seeman concluded (1969, p. 1026), 
"In view of the frequency with which deception is used in research we may 
soon be reaching a point where we no longer have naive subjects, but only 
naive experimenters. It is an ironic fact that the use of deception, which 
is intended to control the experimental environment, may serve only to con- 
taminate it." In the long run this same argument will be applicable to natur- 
alistic research. Referring to naturalistic experimentation, Jung concludes 
(p. 58) "psychologists are contributing toward their own downfall by estab- 
lishing a credibility gap between themselves and the public. And the ensuing 
aura of mistrust and suspicion that would pervade daily life would be a high 
price to pay." Any research paradigm that precludes the right of the subject 
to give informed consent and exercise his right to receive an explanation and 
clarification of research findings may be in the long run self-defeating, as 
well as unethical. 

In summarizing the few public opinion surveys on computers, privacy and 
record-keeping, Westin and Baker (1972, p. 468) state that "privacy-related 
issues are a matter of solid minority concern." About one-third of the res- 
pondents were distressed by what they felt was an erosion of their right to 
privacy. The public is aware of and appreciates the legitimate needs of govern- 
ment and industry for information, but Westin concludes (p. 388), and I agree, 
"that this would be a bad moment in our national history to adopt such a 
policy." There is in this nation today a high level of distrust concerning 
government surveillance and people fear that where such surveillance by govern- 
ment, industry or science is tolerated, repressive action might be directed 
against citizens. In countries such as Sweden, Norway and Israel, where such 
distrust does not prevail, privacy is not seen as an important manifestation 




of civil rights. For many citizens and their government representatives 
in the United States, however, naturalistic observation and experimentation 
present the same danger as a citizen numbering system, databanks, and wide- 
spread psychological testing of school children; all these forms of inobtrusive 
surveillance are felt to violate individuals' rights to privacy and "inviolate 
personality," rights that can be waived but not abused, even by research inves- 
tigators. 

Legal scholars (e.g., Miller, 1971; Westin & Baker, 1972) encouraged by 
appropriate Senate subcommittees (e.g., Administrative Practice and Procedure, 
and Constitutional Rights) have been examining the computer-privacy question 
at least since 1967 when the National Data Center was proposed. These inter- 
ested parties continue to urge lawmakers to consider the new information tech- 
nologies and the effects computers may have on individual privacy in contemp- 
orary life. We may expect these watchdogs to continue monitoring evidence of 
the individual's loss of control over personal information, including unwanted 
intrusion through naturalistic experimentation in public or private places. 
Strategies for Resolving Problems Associated with Use of Deception 

Strategies deemed appropriate for resolving problems associated with 
deceitful practices depend upon the metaethical orientation one adopts toward 
the use of deception. From a utilitarian (cost/benefit) approach, deception 
is appropriate if the benefits outweigh the costs. Therefore, one may decrease 
the costs either a) by debriefing, and/or b) by avoiding unacceptable forms of 
deception (as determined by public opinion polling). Alternatively, one may 
increase the benefits to the subject a) by treating him with the respect due 
to a collaborator, and/or b) by reimbursing him with financial or other rewards. 
The absolutist approach rejects all justification of deception and requires the 






investigator to develop new methodologies that do not require deception. 
All these strategies for dealing with the ethical problems associated with 
deceptive research practices will now be considered. 

Decreasing Costs by Debriefing 

The purpose of debriefing in research involving deception is to correct 
subjects' induced misperceptions about their own and others' performance and 
to reestablish conditions of trust in the professional relationship. There 
is some question as to whether even the most effective debriefing can reverse 
these undesirable aftereffects of deception procedures. According to section 
8-9 of the APA Code of Ethics: 

The investigator has the obligation to assure that research participants 
do not leave the research experiencing undesirable aftereffects attri- 
butable to their participation. Such negative consequences can arise 
if the participants are permitted to remain confused or misinformed about 
important aspects of the study or, more serious still, if steps are not 
taken to remove effects of psychological stress or other painful conse- 
quences resulting from research participation. 

But as Seeman (1969, p. 1027) points out: 

When a person is told that he has been deceived, he may quite conceivably 
be confused as to when the deception had really taken place. Since he 
will quite appropriately have lost confidence in the person's veracity, 
the subject may never be able to disentangle the times of truth and the 
times of falsity in his relationship to the experimenter. 

For example, in the Milgram experiment, debriefing would not reinstitute the 
subject's self-image or his ability to trust adult authorities in the future. 
The subject did after all commit acts which he believed at the time were harmful 
to another, and he was in fact entrapped into committing those acts by an indi- 
vidual whom he had reason to trust. 

It is my observation that investigators concerned about the effects of 
revealing deceptive practices are increasingly opting for leaving the subject 
uninformed or misinformed. 

In my view the investigator must forego the opportunity to engage in 
research that permits only two possible alternatives: deceptive debriefing 
(in which the truth is withheld from the subject because full disclosure 
would lower the subject's self-esteem or affect the research adversely); or 
inflicted insight (in which the subject is given insight into his flaws, 
although such insight is painful to him and although he has not bargained for 
such insight). In section 8-9 of the APA Code of Ethics concerning the obli- 
gation of the investigator to remove misconceptions about the subject him- 
self or his performance in the experiment, whether these misconceptions have 
been deliberately or unintentionally induced, the question is asked but not 
answered: "Must the investigator correct misinformation or provide missing 
information even when this will be distressing to the participant?" (1973, p. 76) 
The situation, as I see it, is this: the investigator, to further his own 
end (i.e., to do worthy research as efficiently and effectively as possible) 
contrives a predicament for himself where, as he sees it, he must choose be- 
tween two equally unacceptable alternatives in his treatment of subjects, that 
is, deceptive debriefing or inflicted insight. The solution to this "dilemma" 
is simple. The investigator need only reject his original experimental design 
as unethical on the grounds that it allowed him only two alternatives, both 
morally unacceptable (i.e., that it placed him in a moral dilemma). He can then 
proceed to invent another and more ethically acceptable design. No experimental 
procedure anticipated by the investigator to require deceptive debriefing in 
order to guard the subject's self-esteem or mental health ought to be considered. 
For deceptive debriefing violates the subject's fundamental rights to have mis- 
conceptions removed subsequent to the experiment and to receive honest (although 
not necessarily complete) feedback concerning the findings of the experiment. 
The investigator's duty is clear. Just as he may not intentionally design an 
experiment in which it is necessary to kill or maim the subject to facili- 
tate effective and efficient research, so he may not design an experiment 
in which it is necessary to deceptively debrief a subject. 

Concerning second order deception (i.e., deceptive de-debriefing), 
Kelman (1967) states, 

Such a procedure undermines the relationship between experimenter and 
subject even further than simple misinformation. . . deception does not 
merely take place within the experiment, but encompasses the whole def- 
inition of the relationship between the parties involved. Deception 
that takes place while the person is within the role of subject for 
which he has contracted can, to some degree, be isolated, but deception 
about the very nature of the contract itself is more likely to suffuse 
the experimenter-subject relationship as a whole and to remove the 
possibility of mutual trust. (p. 2) 

Some, but not all, of the above objections to debriefing can be met 
provided that the investigator takes seriously his responsibility to offer 
subjects a reparational experience. Aronson and Carlsmith (1968) point out 
that debriefing requires considerably more than blatant exposure of the truth; 
subjects' reactions are in part a function of the experimenter's tact and 
consideration. The experimenter can express his own discomfort at using 
deception and explain in detail its necessity and the care that went into 
making the procedure believable, thus reducing the subjects' concerns about 
being found gullible. To the extent that subjects are permitted to gradually 
work out the truth for themselves, these writers believe that they will feel 
less victimized. 

Mills (in press) emphasizes that the clarification procedure or debrief- 
ing may itself have harmful effects unless conducted with great sensitivity. 
He presents in great detail a debriefing procedure, including a scenario, 
which he developed over 20 years of debriefing and which he believes can be 
adapted to explain any experiment using deception. The advantages of the 
scenario are that the investigator is required to put a great deal of care 






and thought into his presentation; he can proceed confidently, covering all 
necessary points so that the participant is provided with an educational 
experience as well as a truthful account of the experiment's actual nature. 
The experiment is explained very gradually and every point reviewed until 
the subject understands. The subject is then given time to reorganize his 
perception of the experiment and his responses to it, from possible humili- 
ation and discomfort to self-acceptance and hopefully sympathetic understanding 
of the researcher's perspective. Certainly investigators, if they use decep- 
tion, should be required to show subjects the respect inherent in Mills' 
scenario. It should be noted, however, that the script leaves no room for 
the subject to object to the morality of the deception and, indeed, makes it 
difficult for him to do so by providing such an air-tight rationalization for 
its use. For reactive subjects concerned with personal agency this could be 
quite offensive. But in most instances I would agree that such extremely 
careful and considerate debriefing could substantially reduce the costs of 
deception and increase the benefits to the subject of his participation. 

Decreasing Costs by Polling the Public 

Many social science investigators claim that most prospective subjects 
would not in fact object to the use of deception were they given a chance to 
vote on the issue. 

There is an important sense in which polling the public does decrease the 
societal costs of the use of deception. By informing the public of the issues, 
polling actually promotes a sense of self-determination for the group as a 
whole, if not for each individual. 

There are in fact a few studies which explore the question of how subjects 
feel about deception. For example, Sullivan and Deiker (1973) surveyed a random 






sample of 400 members of the APA and 357 undergraduate psychology students 
to determine which group most harshly judged deception. Not surprisingly, 
more of the psychologists felt that deceptive practices were unethical than 
did the students. Given the greater maturity of adult judgment this would 
be expected. (The moral to be drawn from this study, in my opinion, is not 
that the use of deception is ethical but rather that undergraduate psychology
students are still in need of ethical guidance.) More studies with other popu- 
lations are needed. 

I recommend, therefore, that where investigators plan to use deceit or 
where informed consent cannot be obtained, representative samples of people 
be matched with the individuals to be investigated to serve as peer consul- 
tants and to review the proposed experimental or observational procedures. 
These peer consultants, selected in the same manner as public opinion poll 
respondents, could assist investigators in identifying ethical problems and 
serve as informants to evaluate the effects of deception.

The public should know the kinds of risks a volunteer subject may expect 
to undergo. While in a general sense subjects would be less naive as a result 
of a publicity campaign, their set might also be more standardized and their 
behavior less suspicious in a given experimental situation. The cat and mouse 
element is reduced when subjects are encouraged to act "as if" the experimental 
instructions are straightforward. Investigators would realize that a "naive" 
subject is one who has agreed to suspend disbelief rather than one who presum- 
ably has been fooled into believing duplicitous instructions. 

Increasing Benefits to Subjects 

The investigator's indebtedness to subjects should be expressed in materi- 
al payment and in focussed attention to the subject as a human being. The 
investigator seldom perceives in positive terms his indebtedness to the subject, 




perhaps because the detachment which he thinks his function requires prevents 
appreciation of the subject as a person. Yet a debt does exist, even when 
the subject's reason for volunteering includes course credit or monetary gain. 
Particularly where experimental conditions expose the subject to loss of dig- 
nity or offer him nothing of intrinsic value, the experimenter is obliged to 
reward the subject with something the subject values. In addition to material 
rewards, the experimenter should make time to express his appreciation to the 
subject, answer his questions in detail, assure him that he did well, and ex- 
change amenities. Subjects should be the first recipients of knowledge gained 
from the project—knowledge specifically about themselves and then about the 
questions the research is designed to answer. If a subject is seeking an oppor- 
tunity to have contact with and confide in a person with psychological training 
these personal needs also should be met. To the extent that it is possible, 
subjects should be actively involved as collaborators in ongoing research. I 
will quote Eisner's excellent treatment of the debt owed to the subject and 
the way in which this debt can be repaid. 

The social status of each subject renders him powerless within the 
research setting. Furthermore, the fact that experiments are carried 
out, for the most part, in the experimenter's laboratory, with his 
equipment, according to his rules, combined with the prestige and recog- 
nized expertise of the experimenter, further contributes to the power 
deficiency of the subject (Kelman, 1972). 

Giving subjects input regarding the purposes and goals of research, 
and procedures, reduces the discrepancy between the power of the subject 
and experimenter, and simultaneously can alleviate certain ethical 
problems (Kelman, 1972; Mead, 1969), particularly in terms of the costs/ 
benefits approach. First of all, potential subjects or their peers 
might be useful in pointing out the possible harmful effects of the 
research, in other words, in assessing costs. Secondly, input into goals 
affords the subject the opportunity to reap some of the benefits of the 
research. It may also make research intrinsically interesting for the 
subject, and possibly more relevant to his own life. This is particu- 
larly applicable in the case of action-oriented research (Chein, Cook &
Harding, 1948). Involving the subject in a way which benefits him gives
validity to the application of a costs/benefits analysis of a given piece 
of research. 




Among the social scientists who have advocated increased subject 
involvement are Kelman (1972), Parsons (1969), Mead (1969), Argyris
(1968) and Wallwork (1975b). Granted, extending to subjects complete, 
or, perhaps even equal control over research would be impractical, if 
not impossible. Because of the investigator's specialized knowledge, 
he is far more competent in experimental design and methodology. In 
that area he must have the bulk of the power (Kelman, 1972). Argyris 
(1968) compares the relationship between subject and experimenter to that 
of employer and employee. Like employees, subjects do not want to take 
over, to run the whole project. They simply want greater influence 
and opportunity to participate in the planning. Actively involving sub- 
jects in research has methodological advantages as well. Subjects tend 
to be more cooperative if research is perceived to be relevant to their 
own lives (Argyris, 1968). (1975, pp. 68-70.) 

Developing New Methodologies 

In order to appropriately assess the cost/benefit criteria it is essential 
to identify worthy research objectives where investigators claim the use of 
deceptive practices is mandatory. I would suggest that the commissioners 
contract for at least one paper on this vital subject. However, the assumption 
that certain phenomena of interest cannot be investigated otherwise must be 
examined critically. In many cases where this claim is made deception may 
actually occur because investigators have come to rely on specific research 
designs based on deceit (as for example the Asch situation in the study 
of conformity) and because deception per se is viewed either as a prestigious 
methodological device or as a simple solution to research-design problems. 

Brief mention will be made of new methodologies being developed as a 
result of dissatisfaction with traditional experimental methods or in response 
to ethical problems. This is not the place to assess in detail their scien- 
tific merit although that question is relevant to a cost/benefit analysis. 

Role-playing has been suggested (e.g., Kelman, 1967) as a way of avoiding 
deceit. There is reason to believe that subjects frequently role-play naivete 
whether asked to do so or not. But the effect of actually asking them to do 
so may introduce a different artifact; subjects may be able to role-play the 
direction but not the magnitude of particular behavior. Where role-playing 


has been used there is some evidence that subjects can simulate gross but 
not subtle intervention effects in conformity experiments (e.g., Geller, 1970;
Horowitz and Rothschild, 1970; Willis and Willis, 1970). Appealing as this 
"solution" is, there are good theoretical reasons to doubt the efficacy of 
role-playing as a substitute for the real thing. To the extent that a sub- 
ject does not know how he would react in a given situation he cannot role- 
play realistic behavior; were such information available investigators could 
merely use introspection rather than experimentation to determine reaction. 

Simulation, which is a special kind of role-playing, may have greater
potential as a substitute for deception. Perhaps the most famous experiment
using simulation is Zimbardo's simulation of prison life (1973) where 24 vol-
unteer subjects were randomly assigned to the role of jail guard or prisoner.
The experiment was sufficiently successful in simulating loss of autonomy in 
"prisoners" and abuse of power in "guards" that it had to be terminated after 
6 days. 

Overt field research, using either structured situations or naturalistic 
observation followed by intensive interviews is, in my opinion, the method for 
avoiding deception and obtaining valid, representative, sound psychological 
data. Subjects can be fully aware they are being observed and even that the 
investigator may introduce stimuli intended to produce a range of scientifically 
interesting responses. While covert observation and staged occurrences create
serious ethical dilemmas, overt observation of representative behavior is pos- 
sible when subjects are given the opportunity to become accustomed to the 
observers, tape recorders and videotapes. My own research (Baumrind and Black, 
1967; Baumrind, 1971) relies heavily upon field research supplemented by inten- 
sive interviews which probe attitudes and values relevant to already observed 
behavior. These interviews also allow for examination of subjects' feelings and 




attitudes about research, the setting, their own reactions, and the rela- 
tionships the investigator is studying. The fact that the interviewer has 
been a participant observer provides a shared focus of attention for the inter- 
view and decreases the likelihood of intentional misrepresentation or uninten-
tional romancing by the subject. 
Enforcement of Regulations Governing Protection of Human Subjects 

Most social scientists have greeted the HEW guidelines with dismay and 
confusion. Investigators, with some justification, are complaining of har- 
assment by university committees composed largely of lawyers whose main con- 
cern is with neither the scientific enterprise nor the protection of subjects, 
but rather with the protection of the university from suit. By sensitizing
subjects to their rights and to the possibility of gain, these committees 
increase the probability that such suits will be brought. Both committees 
and investigators, for self-protective reasons, become overly cautious and 
concerned with following the letter rather than the spirit of the regulations 
and lose sight of the fact that the intention of the regulations is to protect 
the subject. "Major limitations upon scientific progress have been imposed by 
an overly restrictive interpretation of rules for the protection of human 
subjects. The integrity and privacy of the individual must be protected, but
the procedures for insuring the welfare of the individual need not be so cum-
bersome and stultifying as currently practiced."²

Many investigators (including myself) feel that in operation University
review committees fail to protect the welfare and rights of human subjects
because that is not their primary aim. Their primary aim often appears to be
protection of the institution from civil suit by subjects and/or harassment by



² Wayne H. Holtzman, Chairman, Final report of the President's Biomedical
Research Panel of the Social and Behavioral Development Interdisciplinary
Cluster, October 1, 1975.




HEW officials. Adherence to enlightened ethical principles and/or concern 
for the welfare of subjects are secondary concerns. As long as investigators 
and University committees experience themselves as under dire threat, which at
present they do, these public servants will attend to their own survival needs 
first, and the welfare of the community second. 

The research enterprise is in fact threatened on many fronts. Funds for 
all research, but particularly social and behavioral research, have been 
sharply reduced. The respect scientists traditionally enjoyed in the community 
has been undermined. In my view, some of the most fruitful investigations 
and creative investigators are most threatened by loss of financial and 
social support. In social and behavioral science, the major recent
findings (as the President's Commission's final report of the Social and Behavi- 
oral Development Interdisciplinary Cluster concludes) are an outgrowth of longi- 
tudinal studies. But studying the same individuals repeatedly requires con- 
tinuity of support and dedicated application of research skills for several 
decades. Without advance commitment of long-range support, longitudinal 
programs cannot be effectively pursued. Creative researchers, motivated by 
dedication to knowledge and by personal autonomy, are the real victims of 
restrictive regulations and punitive sanctions. For it is they, more than 
those for whom research is merely a means of attaining material and social 
rewards, who suffer the loss of the intrinsic rewards of the research enter- 
prise itself. 

Perhaps the most serious and legitimate concern behavioral scientists 
have about pressure from Washington is that it is frequently political in form. 
Pressure may be from the right or the left. Many investigators believe that 
the liberal ideology of most social scientists has resulted in punitive action 
from a conservative administration in the form of reduced funding and general 




harassment. Other investigators are more concerned about pressure from the 
left suppressing lines of investigation whose results may prove politically 
embarrassing. Genetic research is a particularly sensitive area because of 
its capacity for revealing differences between groups of people in socially 
valued attributes. For example, in Boston last year, public pressure forced 
cancellation of an investigation of an acknowledged reality, the "X Y Y syndrome"; 
there occurs in roughly one in 1000 males an extra sex chromosome, labeled 
Y, which limited statistical evidence suggests may be associated with anti- 
social behavior. A group known as Science for the People, aided by other 
activist critics, was able to exert sufficient political pressure to force the 
investigators (Walzer and Gerald) to truncate their research program. 

Similarly, a category called "social risk" was invoked by the Small Grants 
Section of the NIMH to effectively block normal peer review of two separate 
grants on the basis of "apparent failure to consider the probable social con- 
sequences of the study." One of the censored studies (Littman) proposed to 
study exploratory behavior in order to detect individual differences in social 
and intellectual competence among mildly retarded children. The second (Horn), 
proposed a secondary analysis of data on 624 white children and 209 black child-
ren in a study of "fluid" and "crystallized" intelligence. Here, two so-called 
ethical principles were invoked to censor his proposal --one, that consent for 
the secondary analysis had not been obtained from the original subjects, and the 
other, that a social risk to the class of which the subjects were members 
existed. I regard these two "ethical" rules and the use to which they were put 
as examples of harassment at best, serious violations of academic freedom at 
worst. Hopefully neither the principles governing protection of human subjects nor
the enforcement of these principles will continue to be used to create an atmos-
phere of vigilantism and scrupulosity. I strongly recommend to the Commission that




it take steps to see that the effects of its actions are positive and do
not create a new self-perpetuating bureaucracy whose immediate victims are
innovative scientists.

Yet, as I documented at the beginning of this essay, social scientists 
have not given evidence that they can be trusted to regulate themselves to 
safeguard the rights and welfare of human subjects (witness the attached code 
of the American Sociological Association). Most do not believe that deceitful 
practices and failure to obtain informed consent constitute serious ethical 
violations. Most would probably agree with the sociologist Paul Reynolds (1972, 
p. 697) that among the examples of ethical problems cited by the APA "there is 
not a single instance of any individual suffering permanent damage as a result 
of participation in 'psychological' research;" and furthermore that (referring 
to APA guidelines) "it is difficult to justify such an elaborate set of prin- 
ciples to guide research." 

Moreover, the guidelines (APA and HEW) themselves facilitate abuse of the 
rights of human subjects by a) requiring informed consent and restricting the 
protection of the rights and welfare of subjects only to those likely to be at 
risk, and b) permitting risks to an individual provided it can be shown that 
those risks are outweighed either by the potential benefit to the individual 
or by the importance of the knowledge to be gained. 

Regretfully, I must conclude that effective external regulations and 
sanctions are necessary. How can they be made less onerous and more acceptable?

I recommend that the same structure which imposes sanctions against un- 
ethical practices also assist investigators in accomplishing their legitimate 
objectives using ethical means. When an investigator encounters a problem he 
believes requires the use of deceit, manipulations or noninformed consent, he 
should be able to submit his predicament to a peer group for help in devising 




more ethical procedures. If this is unsuccessful, an ombudsman should be 
available to represent the investigator's position to the ethics committee. 
A widespread educational campaign to inform the public about the social 
role and value of scientific enterprise as well as the ethical dilemmas sci- 
entists face in conducting their work should be mounted via the media. For 
example, citizens can be invited to respond to a graphic public opinion survey 
similar to the one successfully mounted by social policy planners responsible 
for the development of Yosemite National Park. More than 40,000 Californians
clamored to participate in the formation of a master plan for the park by com- 
pleting a very lengthy questionnaire composed of a very specific set of 
questions with action implications. A similar questionnaire-substitute for 
the town forum could be used nationally both to educate the public and to de- 
termine its present views on procedures and policy to be instituted for the 
protection of human subjects. 

The decision as to kinds of procedures to be prohibited, regardless of 
the potential benefit to society, belongs to the lay public. As yet we do 
not know whether the average citizen, if informed, would require informed con- 
sent and prohibit deceit regardless of potential social benefits. It is past 
time that we found out. 

Equally important, a widespread educational campaign aimed at the pro- 
fession, perhaps sponsored by the Commission, should be mounted to encourage 
discussion of the ethical issues which the Commission itself is considering. 
Few members of University committees for protection of human subjects are 
ethicists by training or interest. While departmental chairpersons and gradu- 
ate advisors are responsible for supervising students' research, few of them 
understand the ethical issues involved. The most serious ethical violations 
now occur in graduate students' research. These students are seldom offered 




a course in ethics. I recommend that all persons involved in research on
human subjects, or review of such research, be invited or required to attend 
seminars taught by ethicists who examine these issues. 

Perhaps the most effective pressure that could be put on investigators 
is the knowledge that editors would reject reports based on unethical research
where informed consent is not obtained or deceit is used. At present there is 
little evidence that editors or consulting editors include ethical considera- 
tions among their criteria for publication. 

The operation of institutional review panels must be improved. On the 
one hand, investigators must be protected from unwarranted interference with 
the efficiency of their operation and de facto censorship. On the other hand, 
the regulations themselves must effectively prohibit research activities that 
violate subjects' rights or their welfare.
Conclusion 

Perhaps the seminal problem in social and behavioral research is that 
not all investigators do in fact respect their subjects as persons or appreci- 
ate their contribution to the research endeavor. If respect could be assumed-- 
or if it could be taught as an integral part of the social scientist's pro- 
fessional education--then neither the Commission for whom this report has been
prepared nor the various ethical codes of the professional societies would 
be necessary. However, the very existence of ethical codes indicates that 
trust and respect have eroded to the extent that they have had to be replaced 
by formal contractual agreements, and even these are far from satisfactory. 

An examination of the codes of ethics of the three major social science
organizations reveals their established attitudes toward ethical regulation. 
The code of ethics of the American Sociological Association appears to me 
cynical and self-protective; the organization tries to defend itself from 




external regulation and issues a declaration of professional independence. 
The American Psychological Association has produced a balanced, literate 
and profusely illustrated document which reflects but does not seriously 
attempt to resolve the fundamental differences that exist among psychologists. 
The body of the document provides varied and exquisitely detailed rationali-
zations and procedures for violating the 10 clear-cut ethical principles enunci-
ated initially. The statement of the American Anthropological Association is
an idealistic, ethically sensitive, and socially responsible document. It 
asserts affirmatively the absolute obligation of its members to place the 
interest of the subjects before their own and before those of science, and to 
use their scientific findings in the service of all humanity. However, it 
considers none of the intrinsic problems of anthropological research. (e.g., 
in obtaining informed consent does one abide by the code of an authoritarian 
tribal society which places no intrinsic value on the individual, or by a 
Western ethic that ostensibly does?) Nor does it explore the reasons why so 
many third world societies have rejected the attentions of anthropologists as 
intrusive and invasive. 

Alas, if only all men were of good will, the AAA code of ethics would 
suffice to remind us of our higher values and common humanity. But just as 
subjects' motives for participating in research range from the prudential to 
the principled, so do investigators'. Investigators' motivations may include 
the desire to dominate and control interpersonal situations. Unless sublimated 
appropriately, these motives can stimulate dehumanizing behavior toward sub- 
jects and rationalize that behavior in terms of scientific detachment and 
rigor. 

In view, then, of the social, political, and scientific realities of 
twentieth century America, it would seem that we have no choice but to 




substitute some code or contractual agreement for the trust and respect which 
should, but can no longer be assumed to exist among human beings. All human 
activities are permissible in the proper time and place. There is a time for 
deceiving, as there is a time for hating and for killing. The question con- 
cerns time and place. It is the special characteristics of the research set- 
ting that put subjects "at risk" in ways they would not be in ordinary life. 
The use of deception in research, precluding--as it does--informed consent,
cannot be justified today. The threat to privacy and to individual consti- 
tutional rights posed by computer technology and electronic surveillance devices 
in the hands of government and industry executives is too grave in contemporary 
American society to legitimize any justification of violations of constitu- 
tional rights. Social, behavioral, and medical scientists have not demon- 
strated that the legitimization of such violations in order to obtain informa- 
tion and knowledge would produce benefits that outweigh the costs to society. 
In fact, they have failed to demonstrate, as yet, that important scientific 
objectives are precluded by an absolute prohibition against deceitful practices. 
It is essential that they be given a chance to do so. If it could be demonstrated 
that in all probability a socially useful scientific objective could not be 
attained without the use of deceitful practices, I believe that, given an oppor- 
tunity to decide, most segments of the community would consent to the controlled 
use of deceptive practices to obtain that particular objective. I personally 
would not consent, nor would I intentionally use such practices even were the 
community to consent. But neither would I prohibit their use, because 
explicit collective consent by citizen groups substantially lowers the proba- 
bility that individuals will lose trust as a result of deceptive practices or 
that their right to liberty and self-determination would in fact be threatened 
by such practices. Also I am concerned that stringent external regulations 


will drive many creative, intrinsically motivated scientists to abandon 
their research endeavors. 

However, proscriptions against social science methods which violate 
ethical principles may be exactly the impetus required to induce a paradigm 
shift in social psychology away from the study of subjects as objects to the 
study of subjects as active agents. There is evidence that psychologists 
on both sides of the Atlantic are already moving in that direction (Armistead,
1974; Smith, 1974). To the extent that investigators treat subjects as though
they are purposive, active, self-reflective persons trying to construct mean- 
ingful experiences for themselves within as well as without the investigator's 
small world, they may become so. New research designs better suited to under- 
standing men and women as active agents engaged with their social environment 
can be developed in response to the ethical and methodological limitations 
of our traditional methods. 







APPENDIX A 

Examples of the Use of Deception Drawn Mostly from 
the Family Socialization Project (Baumrind) 



Here I will illustrate the kinds of ethical problems that have come to 
my attention in the last few years, all of them but the first drawn from my 
own research. 

1. An undergraduate student studying nonverbal communication. This 
fairly typical example illustrates rather well how little attention is paid 
to ethical issues by instructors in charge of training undergraduate and 
graduate students. In 1974 a competent undergraduate at the campus of the
University of California wished to study nonverbal communication. She devised 
a gadget for recording instances of behavior which interested her; this 
gadget could be operated without the knowledge of the subject. She recruited 
student subjects on the pretext that she wished to interview them concerning 
their social and political attitudes and then recorded secretly their nonverbal 
reactions to the interview questions. It should be noted that her faculty 
sponsor did not raise the ethical issue with her. However, during the course 
of her study several friends questioned the ethics of her procedures, which 
led her to wonder how her subjects would feel if they discovered that they had 
been duped. Since it was a small campus, she was sure some subjects would make 
that discovery. The student, like many more mature investigators, felt that 
the real harm to the subjects would come from the debriefing itself. Therefore, 
she never debriefed her subjects. Was the scientific value of her study suf- 
ficient to justify the use of deceit and failure to debrief? She had never 
raised the question, nor had any of her instructors; I was the first to do so. 
The student learned to regard the use of deception as normative, and covert ob- 
servation as acceptable. The methodological requirements of her study did in fact 




necessitate concealment. But there were ethical ways in which the concealment 
could have been accomplished. For example: 

a) Subjects could have been selected from amongst those who agreed 
to accept the instructions as given, with the understanding that, as is true in 
many experimental situations, the entire truth might be withheld and they would 
receive a full explanation of the objectives of the investigation subsequent 
to the study. S's given such instructions easily suspend disbelief since what 
they are suspicious about has been admitted candidly from the beginning. 

or b) Nonverbal cues could have been recorded in conjunction with
an actual social survey conducted for bona fide purposes by another investigator.
Debriefing would include acknowledgment that additional information had been 
collected. Consent after the fact would be obtained from all subjects who were 
retained in the unlikely event that any S objected to having such complete 
data about his or her behavior collected. 

2. A graduate student using a modified Prisoner's Dilemma Game in my
research project. A second example is described by a graduate student whose
dissertation I helped to supervise. His account is as follows:

I encountered an ethical problem in my doctoral research when I 
decided to use a modified Prisoner's Dilemma Game. The game was 
played by two subjects at a time at computer terminals. The sub- 
jects were nine-year-old children. In order to establish a base- 
line for each child's level of cooperation, I planned to present 
each subject with a standard sequence of plays stored in the com- 
puter. Then, to measure the children's interactive play, I planned 
to present each child with his or her partner's actual choice. My 
initial plan was to deceive the children by telling them that they 
would always be playing with their real partner. Dr. Baumrind, one 
of my dissertation advisors, refused to go along with this on the 
grounds that to falsify the children's perceptions of their social 
interactions was wrong. We cast about for a solution that would 
preserve the experimental design and that would also be free from 
deception. The solution which was actually applied was to inform 
the children that as they played, part of the time their partners 
would be real (i.e. human) and part of the time the computer would 
be their partner. I added that, since they would not know when 
they were playing with the computer and when they were playing with 




their real partner, they should play as if they were playing with 
their real partner. Thus, although the children were left in doubt 
until the end of the experiment as to who their partner was, they 
were not deceived. In fact, for the first 125 of the total 200 tries, 
subjects were playing against a computer. 

Questioning after the game indicated that all of the children understood 
the actual situation. The children's comments and the data that 
emerged from this experiment were consistent with those of colleagues 
who used deceptive instructions; so that in this instance deceitful 
instructions appear to have been unnecessary to accomplish my exper- 
imental objectives. The information obtained during the 75 trials 
when the child was engaged in truly interactive play yielded information 
interesting in its own right. 

My concern here was that if the children were deceived from the beginning
the experimenter would have an untenable choice during the debriefing process--
either he would have to tell the children they had been deceived in the first
place, thus positively sanctioning the practice of deception by an adult
authority, or he would have to forego debriefing altogether in which case the
children would leave the experimental situation misinformed concerning the
extent to which their peers used cooperative or competitive strategies. My
judgment was that in either case the child's own ethical judgment would be
affected adversely, and that the risk no matter how small could not be justified
by any gain in knowledge accruing from the experiment to the subject.

3. My own research — observing children in the school setting. We routinely 
collect information on each child in the school setting. The information we 
collect might be more representative if the children did not know they were being 
observed. However, for ethical reasons our practice is to obtain the children's 
explicit permission to make school visits although we have already obtained 
full consent from their parents. Therefore, we interview the child prior to 
the first school visit, and at the end of the interview obtain his permission 
to make a series of school visits. But incidental to our observations of the 
subject child, we do take notes on other children who are part of his environ- 
ment. These children not in our study on whom information is collected 

23-57 



incidentally are not told that they are being observed. It is our judgment 
that to do so would burden the students. The child might then become self- 
consciously concerned that any visitor in the room was observing him, even 
those he did not know or with whom he had not established a relationship. 
Since our choice was among distressing the children by asking their permission, 
not making school visits at all, or failing to make full disclosure, we chose 
the last alternative. Since the children are in no way distressed or 
the latter alternative. Since the children are in no way distressed or 
deceived by our presence alone, we regard our "failure to make full disclosure" 
as acceptable although not exemplary, and continue to observe the child subjects 
in their school settings. 

4. My own research — active withholding of information. In order to 
protect a sensitive and self-conscious child, we have had occasion to withhold 
the whole truth from other children who asked about our purposes in the class- 
room. The partial truth we tell them is that we are there to learn more about 
how classrooms differ. This partial truth is intended to deceive and is, 
therefore, a lie. When one child thanked the observer for lying in order to 
save him embarrassment, we acknowledged that we had done so for that purpose. 
We do not regard telling a "white lie" (i.e., a lie intended to prevent dis- 
comfort) to a child as setting a bad example. Since we had created a situation 
in which the child was placed "at risk", we felt we had the responsibility for 
minimizing that risk although to do so involved deceit. The implicit contract 
with the subject-child is that we will relate to him or her in a supportive 
and partisan fashion and that is what we do. It was our judgment that the 
telling of a partial truth to the other children did not place them at risk, 
because it is understood by school children that adult strangers need not take 
them into their confidence by revealing their full intentions. 

5. My own research — lying thoughtlessly and unnecessarily. On occasion 
in our research, we find ourselves lying unthinkingly and for no good reason. 

23-58 



For example, one of Flavell's role-taking tasks is administered as follows — 
and we followed the standard instructions until we thought more about it. 
E. displays a series of seven pictures and asks S to tell the story which they 
illustrate. Three specific pictures are then removed, E2 enters the room, and 
S is requested to predict the story which E2 would probably tell from the 
remaining four pictures. S is told that E2 has never seen the whole series of 
seven pictures. This is of course a lie and an unnecessary one at that. 
Incidentally, it is not believed by most bright children. (In fact, a child 
brought to our attention that we were lying by saying, "Aw, come on, how 
long has E2 been working here?!") It is sufficient for our purposes to instruct 
S to predict the story E2 would probably tell if he had never seen the remain- 
ing four pictures. The added advantage of this procedure is that the set for 
all S's is standardized. 

6. My own research — lying by implication. It is our practice to film 
a family discussion situation. The family is told explicitly that they will 
be filmed. However, prior to the full family discussion in which both parents 
and the subject discuss the Kohlberg moral judgment stories, the parents have 
fifteen minutes together in which they plan their approach. For months we 
overlooked telling the parents that this portion of the interaction was being 
filmed. Since we had been so honest with them about our procedures and intentions, 
they assumed that we would have told them if they were being filmed. It happens 
that one of the videotapers was aware that the parents were being misled into 
thinking that they were not being filmed. He felt that the Information thus 
obtained was particularly valuable because it appeared so informal — the family 
is in a living room setting and no observers are present — and that to tell them 
they were being filmed would reduce the informality. He did not feel that by 

23-59 



saying nothing he was lying. Once I became aware of the situation, parents 
were from then on informed. 

I present these rather trivial but typical instances of the use of deception 
to illustrate how ubiquitous the use of intentional and non-intentional deceit 
is even when the investigator is sensitive to ethical issues, and also to 
suggest that in most instances deceptive practices can be eliminated and the 
objectives of the research nonetheless achieved. I also wish to illustrate that 
it is not deception in a vacuum which is ethically unacceptable; it is the 
violation of the subject's basic rights, particularly the right of self- 
determination, which so often occurs with the use of deceptive practices that 
cannot be accepted on ethical grounds. 
Postscript 

In my final re-reading of this paper I note that despite my objections 
to the implications of the term "subjects" I continue to use that term to refer 
to participants. This is not only logically inconsistent and revealing, but 
has the effect of reinforcing a public attitude towards participants which I 
contend should be changed. In the event that this paper is published by the 
Commission I request, therefore, that the word "subject" be changed to 
"participant." 



23-60 



APPENDIX B 

Procedures for Obtaining Informed Consent Used by 
the Family Socialization Project (Baumrind) 



Participation is solicited by telephone from prospective subjects. Those 
who are willing to explore further the possibility of participation are sent 
a lengthy summary of procedures in which they would be expected to participate 
were they to consent. This is followed up by a visit to their home in which 
all family members are present. At that visit the procedures are explained 
further to the parents, and those that affect them are discussed with the 
children. There are three separate consent forms, all appended. One consent 
form, to be signed by both parents, signifies agreement with the procedures 
described. The second consent form is in the form of a letter to the child's 
principal and teacher requesting their cooperation. Willingness to sign this 
letter signifies a high level of commitment to the project. There is a third 
consent form for use of case history material. 

Note that the child's written consent is not obtained at this age (ages 
8 or 9) and that none of the consent forms specify possible benefits or costs. 
These considerations are discussed during the home visit. The following 
description of costs and benefits is provided the University Review Group. 

Effects on Subjects 

Beneficial. The relationship with subjects is collaborative. In 
addition to the information about family processes which subjects provide, 
their critical abilities are harnessed to our own in revising measures. 
Parents are given copies of the self-report and other measures in order 
that they may continue to explore in their own minds the child rearing and 
value issues which these measures assess. Lengthy conferences are arranged 

23-61 



with each family to provide feedback. In addition, an honorarium of $150.00 
is given to each family. 

Potential drawbacks. 

1. Invasion of privacy. Our procedures include invasion of the privacy 
of the homes. Protection of subjects is afforded by selection of observers 
who are courteous, supportive, tactful, and professional in their demeanor. 

In order to assure confidentiality, data are converted to IBM cards and data 
tape. In this form the subjects are fully anonymous. 

2. Deceit. We avoid the use of procedures which require deceit or 
covert observation, even where to use these procedures would provide us with 
more valid data. For example, observers are instructed to interview the 
child prior to school visits and to obtain the child's consent to these visits, 
even though observation would be more "naturalistic" if the child were not 
aware that he was being observed. While we tape behind a one-way mirror, 
all family members take their turn observing behind the mirror so as to assure 
their full awareness as to what can be seen and heard. 

3. Unanticipated self-knowledge. The intensive interviews concerning 
moral judgment and child rearing practices will initiate in some parents a 
re-examination of their own values. For a few this self-examination may 
initiate insights and changes which would be facilitated by discussion with a 
psychologist. We have a person on our staff to perform this function. 

Benefits to the Lay and Scientific Community 

Characteristics of this program of research which contribute to its scientific 
significance are a) in-depth collection of data using multiple settings and 
measures over time; b) the use of an extensive battery of objectively scored 
tests and videotape transcripts to supplement the ratings; c) the longitudinal 
nature of the data collected; d) the fact that the sample studied is from the 

23-62 



San Francisco Bay Area, an area of the country where secular changes are 
first felt so that the relationships noted should have predictive significance 
for the rest of the country; e) dimensions of child-rearing and of child 
behavior are studied configurally rather than in isolation thus permitting 
important distinctions between parents and between children to emerge. 

The focus of the program of research is on patterns of parental authority, 
an area of acknowledged social importance, particularly today. The way in 
which authority has been conceived and exercised has been one of mankind's 
constant concerns through the ages and assumes particular interest in a 
period of rapid social change. 



23-63 



UNIVERSITY OF CALIFORNIA, BERKELEY 


BERKELEY • DAVIS • IRVINE • LOS ANGELES • RIVERSIDE • SAN DIEGO • SAN FRANCISCO • SANTA BARBARA • SANTA CRUZ 



EDWARD CHACE TOLMAN HALL 
INSTITUTE OF HUMAN DEVELOPMENT BERKELEY, CALIFORNIA 94720 

Family Socialization and Developmental area code 415 642-3603 

Competence Project 

PERMISSION FORM 



Name of child 

(please print) 

I have read the SUMMARY OF PROCEDURES which you provided. Our family is willing to 
participate in this phase of the study, and you have our permission to perform the procedures 
listed. You also have our permission to videotape some of the procedures for use by members 
of the research staff only. I understand that no use other than coding and analysis of data 
by the staff will be made without my specific, written consent. 



Signature of mother Name 



(please print) 
*Social Security No. 
Signature of father Name 



(please print) 
*Social Security No. 



Address 



Phone 



*For payment of Honorarium. 






23-64 

A National Institute of Child Health and Human Development project for the study of child-rearing attitudes and practices. 



UNIVERSITY OF CALIFORNIA, BERKELEY 

BERKELEY • DAVIS • IRVINE • LOS ANGELES • RIVERSIDE • SAN DIEGO • SAN FRANCISCO • SANTA BARBARA • SANTA CRUZ 




Institute of Human Development 1203 Edward Chace Tolman Hall 

Family Socialization and Developmental Berkeley. California 94720 

Competence Project 



TO: My Child's Principal and Teacher 



Our family is participating in the Family Socialization and Developmental 
Competence Project (a study affiliated with the Institute of Human Development, 
University of California, Berkeley, directed by Diana Baumrind, Ph.D.). The 
study is concerned with the childrearing attitudes and behavior of parents and 
their effects on the development of their children. 

One of the project's interests is to evaluate how the child adapts to his/her 
particular school environment. This would involve an observer paying about 
three visits to our child's school and observing classroom and playground 
situations in which our child participates. We have already granted the project 
our permission to do so, if this meets with your approval. If you have any 
questions as to what this will involve, please call Dr. Baumrind's office, 642-3603, 
from 9 to 5 on week days, and an observer will be glad to answer your questions. 

Sincerely yours, 



Signature of Parent 



Name (Please Print) 



Child's Name (Please Print) Address 

School Grade 



23-65 



UNIVERSITY OF CALIFORNIA, BERKELEY 

BERKELEY • DAVIS • IRVINE • LOS ANGELES • RIVERSIDE • SAN DIEGO • SAN FRANCISCO • SANTA BARBARA • SANTA CRUZ 




FAMILY SOCIALIZATION AND DEVELOPMENTAL COMPETENCE PROJECT 1203 EDWARD CHACE TOLMAN HALL 

INSTITUTE OF HUMAN DEVELOPMENT BERKELEY, CALIFORNIA 94720 

AREA CODE 415 642-3603 



To: Participant families in The Family Socialization and Developmental Competence Project 

(Formerly The Parental Authority Research Project) 

From: Diana Baumrind, Ph. D. 
Principal Investigator 

Re: Consent for using case history material 

Up to this point, data on families have been converted to quantitative scores that yield 
generalizations similar to the following: "The use of mild forms of corporal punishment 
is not associated with any symptoms of maladjustment in the families studied" or "When 
parents require the child to participate in household chores, the child is relatively 
independent and self-assertive." When findings are reported in such a manner, reference 
to individual cases is unnecessary. Because of the richness of the data and their 
longitudinal nature, we are finding that we would like to be able to use the data in 
additional ways. In order to do so, we believe that we should first obtain consent from 
each family for each possibility. So that we will know what "case history" data are available 
to us to use, we are asking that you check the following and return the sheet to us. IF 
EITHER PARENT OBJECTS, PLEASE RESPOND IN THE NEGATIVE FOR THAT QUESTION. 

The videotapes of the family discussion illustrate the ways in which people communicate in 
arriving at decisions. 

1. May we show excerpts from the videotape of your family discussion 
(or parent teaching) to seminars of graduate students or meetings of 
professional colleagues to illustrate ways in which different families 
arrive at decisions? 

Yes No 

2. In articles prepared for a professional audience, may we include brief 
descriptions of the content of the tapes to illustrate family processes? 
Names and similar identifying data are omitted. 

Yes No 



In order to enrich our discussion of findings in journal articles, we would like to be able 
to include brief excerpts from interviews or home visits. In all cases identifying data are 
omitted. 

3. May we use such excerpts from our notes of the home visits or transcripts 
from the interviews with your family? 

Yes No 



NAME OF FAMILY SIGNATURE DATE 

23-66 



References 

Ad hoc Committee on Ethical Standards in Psychological Research. Ethical 
principles in the conduct of research with human participants. Washington, 
D.C.: American Psychological Association, 1973. 

Argyris, C. Some unintended consequences of rigorous research. Psychological 
Bulletin, 1968, 70, 185-197. 

Armistead, N. (Ed.). Reconstructing social psychology. Great Britain: 
Penguin Education, 1974. 

Aronson, E., & Carlsmith, J. M. Experimentation in social psychology. In G. 
Lindzey and E. Aronson (Eds.), Handbook of Social Psychology, 2 (2nd ed.). 
Cambridge, Mass.: Addison Wesley, 1968. 

Brandt, L. W. Science, fallacies, and ethics. The Canadian Psychologist, 1971, 
12 (2), 231-242. 

Brody, P. R. Manipulability of predisposition to choose immediate or delayed 
gratification. Unpublished manuscript, Yale University, 1967. 

Baumrind, D., & Black, A. E. Socialization practices associated with dimensions of 
competence in preschool boys and girls. Child Development, 1967, 38 (2), 291-327. 

Baumrind, D. Current patterns of parental authority. Developmental Psychology 
Monograph, 1971, 4 (1, Part 2). 

Baumrind, D. Metaethical and normative considerations. In E. C. Kennedy (Ed.), 
Human rights and psychological research. New York: Thomas Y. Crowell, 1975. 

Chein, I. The science of behavior and the image of man. New York: Basic Books, 
1972. 

Chein, I., Cook, S. W., & Harding, J. The field of action research. American 
Psychologist, 1948, 3, 43-50. 

DHEW, Food and Drug Administration: Drugs for human use: Reorganization and 
republication. Federal Register, March 29, 1974, 39 (62), 11684-11685 
and 11712-11718. 

23-67 



Eisner, M. Ethical problems in the use of deception in social psychological 
experimentation in the laboratory. Unpublished Master's thesis, York 
University, Toronto, Ontario, 1975. 

* Fillenbaum, S. Prior deception and subsequent experimental performance: The 
"faithful" subject. Journal of Personality and Social Psychology, 1966, 4, 
532-537. 

* Fine, R. H., & Lindskold, S. Subject's experimental history and subject-based 
artifact. Proceedings of the Annual Convention of the American Psychological 
Association, 1971, 6, 289-290. 

Fletcher, J. Situation ethics: The new morality. Philadelphia: Westminster, 
1966. 

Geller, S. H. A test of role playing as an alternative to deception in a 
conformity experiment. Unpublished Master's thesis, York University, Toronto, 
Ontario, 1970. 

Guttentag, M. Models and methods in evaluation research. Journal of the Theory 
of Social Behavior, 1971, 1 (1), 75-95. 

Harre, R., & Secord, P. The explanation of social behavior. Oxford: Basil 
Blackwell, 1972. 

Holland, C. H. Sources of variance in the experimental investigation of 
behavioral obedience. Doctoral dissertation, University of Connecticut. 
Ann Arbor, Michigan: University Microfilms, 1968. No. 69-2146. 

Horowitz, I. A., & Rothschild, B. H. Conformity as a function of deception and 
role playing. Journal of Personality and Social Psychology, 1970, 14, 224-226. 

Jung, J. Snoopology. Human Behavior, 1975, 4 (10), 56-59. 

* Keisner, R. Debriefing and responsiveness to overt experimenter expectancy 
cues. Unpublished manuscript, Long Island University, 1971. 

* Kelman, H. C. Deception in social research. Trans-Action, 1966, 3, 20-24. 

Kelman, H. C. The human use of human subjects: The problem of deception in 
social psychological experiments. Psychological Bulletin, 1967, 67, 1-11. 

23-68 



* Mead, M. Research with human beings: A model derived from anthropological 
field practice. Daedalus, 1969, 98. 

Menges, R. J. Openness and honesty versus coercion and deception in psycho- 
logical research. American Psychologist, 1973, 28, 1030-1034. 

Milgram, S. Behavioral study of obedience. Journal of Abnormal and Social 
Psychology, 1963, 67, 371-378. 

Milgram, S. Obedience to authority: An experimental view. New York: Harper 
& Row, 1974. 

Miller, A. R. The assault on privacy. Ann Arbor, Mich.: The University of 
Michigan Press, 1971. 

Mills, J. A procedure for explaining experiments involving deception. Personality 
and Social Psychology Bulletin, in press. 

Mishkin, B. The expanding concept of injury: Protecting the psychological 
self. Paper presented at the American Psychological Association Convention, 
1975. 

Mixon, D. Instead of deception. Journal of the Theory of Social Behavior, 1972, 
2 (2), 145-177. 

Mixon, D. If you won't deceive, what can you do? In N. Armistead (Ed.), 
Reconstructing social psychology. Great Britain: Penguin Education, 1974. 

Nash, M. M. Nonreactive methods and the law. American Psychologist, 1975, 30 
(7), 777-780. 

Orne, M. T. On the social psychology of the psychological experiment: With 
particular reference to demand characteristics and their implications. 
American Psychologist, 1962, 17, 776-783. 

* Parsons, T. Research with human subjects and the "professional complex". 
Daedalus, 1969, 98, 325-360. 

Reynolds, P. D. On the protection of human subjects and social science. 
International Social Science Journal, 1972, 24 (4). 

23-69 



Ring, K., Wallston, K., & Corey, M. Mode of debriefing as a factor affecting 
subjective reaction to a Milgram-type obedience experiment: An ethical 
inquiry. Representative Research in Social Psychology, 1970, 1, 67-88. 

Schultz, D. P. The human subject in psychological research. Psychological 
Bulletin, 1969, 72, 214-228. 

Seeman, J. Deception in psychological research. American Psychologist, 
1969, 24, 1025-1028. 

Silverman, I. Nonreactive methods and the law. American Psychologist, 1975, 
30, 764-769. 

* Silverman, I., Shulman, A. D., & Wiesenthal, D. L. Effects of deceiving and 
debriefing psychological subjects on performance in later experiments. 
Journal of Personality and Social Psychology, 1970, 14, 203-212. 

Smith, M. B. Humanizing Social Psychology. San Francisco: Jossey-Bass 
Publishers, 1974. 

Stricker, L. The true deceiver. Psychological Bulletin, 1967, 68, 13-20. 

Sullivan, D. S., & Deiker, T. E. Subject-experimenter perceptions of ethical 
issues in human research. American Psychologist, 1973, 28, 587-591. 

Wahl, J. M. The utility of deception: An empirical analysis. Unpublished 
manuscript prepared for Symposium on Ethical Issues in the Experimental 
Manipulation of Human Beings, Western Psychological Association, Portland, 
Oregon: April 27, 1972. 

Wallwork, E. Ethical issues in research involving human subjects. In E. C. 
Kennedy (Ed.), Human rights and psychological research. New York: Thomas 
Y. Crowell, 1975. 

* Wallwork, E. In defense of substantive rights: A reply to Baumrind. In E. C. 
Kennedy (Ed.), Human rights and psychological research. New York: Thomas 
Y. Crowell, 1975. 



23-70 



Westin, A. F., & Baker, M. A. Databanks in a free society. New York: 
Quadrangle Books, 1972. 

Willis, R., & Willis, Y. Role playing versus deception: An experimental 
comparison. Journal of Personality and Social Psychology, 1970, 16, 472-477. 

Zimbardo, P. G. On the ethics of intervention in human psychological research: 
With special reference to the Stanford prison experiment. Cognition, 1973, 
2, 243-256. 

Zimbardo, P. G., Banks, W. C., Haney, C., & Jaffe, D. The mind is a formidable 
jailor. New York Times Magazine, May 20, 1973, 123. 

* Reference cited is contained within quoted material. 



23-71 



24 

SOME COMPLEXITIES AND UNCERTAINTIES REGARDING THE 
ETHICALITY OF DECEPTION IN RESEARCH 
WITH HUMAN SUBJECTS 

Leonard Berkowitz, Ph.D. 



Some Complexities and Uncertainties Regarding the Ethicality 
of Deception in Research with Human Subjects 

Leonard Berkowitz 
University of Wisconsin 

One of the most complex methodological problems confronting the sciences 
engaged in research with humans has to do with the ethicality of deception. 
Yet recent cultural trends have made us more conscious of this problem than 
ever before and press for a relatively quick solution. There has been a re- 
newed emphasis on the value of human dignity and the right of the individual 
to be free from arbitrary coercion in the past several years. Perhaps more 
than at any time since the Great Depression we as a people insist on the 
desirability of individual autonomy. Revenue sharing, suspicion of big 
government, the mounting distrust of politicians, and the spreading popu- 
larity of the economic notion that "small is beautiful," among other things, 
testify to the growing belief that a person should have more control over 
what happens to him. 

I do not mean to question these ideas or argue that the contemporary 
interest in them is only a passing fad. However, it is all too apparent 
that the current concern with individual dignity and autonomy has led some 
people to be highly critical of behavioral science and especially of labora- 
tory experimentation with humans. In their estimates of the possible costs 
and benefits of this research these critics tend to give relatively little 
weight to the favorable consequences that might result. At the same time, 
they stress and perhaps even exaggerate the risks to the research subjects. 



24-1 



From their perspective it is necessary to establish firm guidelines, if not 
restrictive rules, for behavioral science investigators in order to protect 
the rights of the subjects and minimize the injuries that might be done to 
them. I will try to argue in this essay that it is virtually impossible to 
set up a screening agency that will assess the relative costs and benefits 
of a given experiment with any substantial degree of validity, and also, 
that attempts to create a board of monitors which will closely scrutinize 
all research for every conceivable threat to subjects will seriously impede 
the development of behavioral science. In a sense, this paper is a brief 
in support of experimental behavioral science. It would permit the prac- 
tices that most laboratory-oriented behavioral scientists follow today, in- 
cluding reasonable deceptions. Since other writers speaking to the Commis- 
sion will emphasize the risks and ethical difficulties inherent in the use 
of ruses, partial truths, and downright misleading statements in experimental 
research, my own argument, brief -like, will downplay these costs. As a con- 
sequence, it may seem that I do not believe there are any problems or dangers 
in stressing and deceiving research participants. This is not the case. I 
do feel, however, that some of the objections to the investigations being 
carried out by contemporary experimental behavioral scientists are exaggerated. 

Several prominent social psychologists have voiced misgivings about the 
widespread use of deceptions in laboratory studies. To sample just a few of 
these objections, a generation ago Edgar Vinacke (1954) expressed concern 
about experiments in which "the psychologist conceals the true purpose and 
conditions of the experiment, or positively misinforms the subjects, or ex- 
poses them to painful, embarrassing, or worse, experiences, without the sub- 



24-2 



ject's knowledge of what is going on." He wondered what "is the proper 
balance between the interests of science and the thoughtful treatment of 
the persons who, innocently, supply the data?" Vinacke seemed to imply 
that social psychologists all too frequently treated their research sub- 
jects in a non-thoughtful, perhaps even inhumane, fashion in their pursuit 
of their scientific objectives. Some years later Herbert Kelman (1967) 
raised the problem anew in a thoughtful and fairly moderate critique of the 
"unquestioning acceptance" of the "routinization of deception" that exists 
in experimental social psychology. Kelman recognized the necessity of mis- 
leading subjects about the true nature of the research in some types of 
studies. "There are many significant problems that probably cannot be in- 
vestigated without the use of deception," he noted. However, he wondered 
whether social psychologists have the right "to add to life's little anxie- 
ties and to risk the possibility of more extensive anxiety purely for the 
purposes of our experiments." The explanation (debriefing) typically given 
to the subjects at the end of each laboratory session might not, he thought, 
adequately remove all of the harmful effects. Milgram's well-known experi- 
ments on obedience to authority are an excellent case in point. Even though 
the obedient subjects were told afterwards that they had only been in an 
experiment and had not actually shocked anyone, Kelman argued, "there is 
good reason to believe that at least some of the obedient subjects came 
away from this experience with a lower self-esteem, having to live with the 
realization that they were willing to yield to destructive authority to the 
point of inflicting extreme pain on a fellow human being." Even if the 
experience provided these people with an opportunity to learn something of 



24-3 



importance about themselves, as Milgram maintained, do we (Kelman asked) 
"have the right to provide such potentially disturbing insights to subjects 
who do not know this is what they are coming for?" The same thing can ob- 
viously be said about much less stressful research such as conformity ex- 
periments. Is it proper for the investigator to affect his subjects' self- 
esteem by showing them that they were easily swayed by the fictitious group 
pressure? 

Kelman's question about the ethicality of deception research rested 
largely on the possibility of long-lasting adverse consequences. The sub- 
jects might suffer a fairly persistent injury to their self-concepts or ex- 
perience a continuing anxiety that is not remedied by the experimenter's 
debriefing at the conclusion of the session. But in addition, he also won- 
dered if the lies and tricks used in social psychological experiments did not 
also color the subjects' views of the world around them. They learned that 
they had been manipulated by the experimenter's deceptions and this lesson 
could reinforce other demonstrations, all too prevalent today, that man is 
an object to be manipulated at will by societal institutions. "In institu- 
tionalizing the use of deception in psychological experiments," Kelman con- 
tended, "we are, then, contributing to a historical trend that threatens 
values most of us cherish." The Ad Hoc Committee on Ethical Standards esta- 
blished by the American Psychological Association (the Cook Committee) sum- 
marized this type of argument in these words: 

"One frequently hears it asserted that behavioral research 
is contributing directly to the moral ills of society. 
According to this argument, when an investigator invades 
the privacy of another person, employs deceit, or occasions 
pain or stress, he contributes to legitimizing these indig- 
nities, and therefore to their prevalence in interpersonal 
behavior." (Cook et al., 1973, p. 17). 

24-4 



This accusation is a very serious one, especially given the prevalence 
of deception in personality and social psychological research. A good many 
studies in these areas attempt to mislead subjects about important aspects of 
the investigation they are in. One count of 390 published reports in per- 
sonality and social psychology (Stricker, 1968, cited in Silverman et al., 
1970) found that participants were "intentionally misled" in 21% of the stu- 
dies. This is probably a minimum estimate of the frequency with which sub- 
jects are deceived. As Aronson and Carlsmith pointed out (1968, p. 30), 
mild deceptions can be very subtle and common — such as misinforming people 
about the true purpose of a personality test they are taking (for example, 
by introducing the TAT as a test of creativity) or behaving in a pseudo- 
friendly manner to the subjects in order to make them more cooperative. Can 
these widespread practices be defended? Should the researchers employing 
these procedures be subjected to stringent controls established by some out- 
side agency? 

I would like to start this defense of the judicious use of deceptions in 
behavioral experiments by taking up the last two concerns I mentioned: first, 
whether subjects generalize from the experimental situation to other condi- 
tions of life and then, second, whether the experimenter's debriefing can 
alleviate many of the ill effects of the experiment. The discussion will 
then turn more directly to the matter of informed consent and will consider 
the kinds of information that should be given to the prospective participants 
in soliciting their cooperation. 

Kelman believed that social psychologists are shortsighted when they dif- 
ferentiate between the laboratory and the surrounding world. "We tend to re- 
gard the [laboratory setting] as a situation that is not quite real, that can 
be isolated from the rest of life like a play performed on the stage, and to 
which, therefore, the usual criteria for ethical interpersonal conduct become 
irrelevant" (Kelman, 1967, p. 5). Kelman is quite right in one sense; social 
psychologists are inconsistent if they view the laboratory situation as "not 
quite real" and still extrapolate their findings to other social settings. 
He is incorrect, nonetheless, in thinking that investigators defend their 
practices on the grounds that the laboratory "can be isolated from the rest 
of life." The laboratory does not really have its own rules of conduct. Most 
subjects believe that an experimenter's actions are governed by overriding 
standards, general rules that an investigator is expected to follow, much as 
everyone else also follows rules. Thus, according to evidence gathered by 
Epstein, Suedfeld and Silverstein (1973), research participants typically 
feel that an experimenter is primarily obligated to provide clear instructions, 
insure the subjects' safety and warn them of danger. He apparently is not 
expected to be truthful in every detail. It may well be, as many researchers 
believe, that subjects do regard the experimenter's statements to them as 
morally appropriate. The rules of his scientific enterprise, which they 
generally recognize, permit him to mislead them, and he is keeping to these 
rules. For many of them, the larger context within which the study is car- 
ried out serves to justify the deceptions, partial truths and stresses to 
which they had been exposed. 

While I agree with the Cook Committee (1973, p. 17) that further research 
is needed to determine what standards should govern the experimental proce- 
dures, my own experience over more than two decades of laboratory investigations
with university students is entirely consistent with the statement I have 
just made. Many of my experiments in recent years have deliberately pro- 
voked subjects so that we could study the conditions influencing their ag- 
gressive responses. Other social psychologists have conducted similar 
investigations. But despite all of the frustrations and insults adminis- 
tered to thousands of subjects, I have not heard of any complaints about 
these treatments being voiced to university authorities at Wisconsin or 
elsewhere. There certainly have not been any protests sent in to our 
fairly radical student newspaper about this type of research. Of course, 
a few students might have resented the treatment they received, but I sus- 
pect this was quite infrequent, perhaps surprisingly so from the point of 
view of some critics, and even then was very mild. There are good reasons 
for this, some having to do with the debriefing--and I will go into this 
shortly--and others with the perceived legitimacy of the experimental treat-
ments. When the subjects learned at the end of the session what had been 
done to them and why, the great majority undoubtedly readily grasped the 
significance of the research. They also regarded the experimenter's behavior 
as justified within the context of his scientific activity. The provocation 
had not been directed against them personally, they realized, but was in keep- 
ing with the implicit rules of a social-psychological experiment, and was
therefore "de-emotionalized." My firm belief is that for the preponderance 
of university students the scientific context of the experiment similarly 
"de-emotionalizes" many different kinds of stress that they might have ex- 
perienced in the course of the study. 

The subjects understand this scientific justification when they finish 
participating in the study and the researcher explains his purposes and 
methods. The debriefing places the experimental experience in the appro- 
priate context. Contemporary theoretical analyses of emotions as well as 
several recent investigations of the consequences of debriefings suggest 
that these after-the-fact explanations can do much to lessen the unpleasant- 
ness of whatever stresses and strains have been imposed on the subjects. 

These results are not particularly surprising. But I think they parallel 
what often happens in some kinds of psychological experiments when the experi- 
menter debriefs the research participants. Here too, the subjects are pro- 
vided with an explanation that changes the meaning of the threat to which 
they had been exposed in the investigation. They now learn that they had not 
really been confronted by a test of how well adjusted they are or an assess- 
ment of their personal adequacy or a deliberate insult to their self-esteem. 
Perhaps equally important, they find that what had seemed like an arbitrary 
assault directed at them personally was actually an impersonal treatment ad- 
ministered to all of the people in their experimental condition. The event 
that had previously aroused anxiety or anger is now viewed in a very different 
manner, is "de-emotionalized" as I said before, and the subjects' emotions 
subside fairly quickly. 

The debriefing can also cause the subjects to reinterpret their own be- 
havior in the experiment. Earlier in this paper I quoted an argument that 
Milgram had employed to defend his research on obedience to authority: his 
participants had learned something about themselves--they had a tendency to 
submit to authority. However, as Kelman (1967) noted, the subjects might 
not have wanted this kind of insight. In the same vein, Baumrind (1964) 
pointed out that the subjects could have suffered a blow to their self- 
esteem on realizing the full significance of their action. While the self- 
awareness arising from some experimental situations could well produce a 
certain amount of unhappiness, my experience with experiments on aggression 
suggests that it is possible to minimize this distress with an appropriate 
explanation. Instead of focusing on what the individual himself/herself had 
done, our debriefings clearly indicate (quite accurately) that we are not 
at all interested in the subject as a distinct person; we only want to know 
how students in general behave under the conditions of our experiment. More- 
over, the subject is also assured (and again, this is usually a fairly ac- 
curate statement) that quite a few other people had acted in a similar way. 
Perhaps this is a commentary on the state of ethical judgments in our own 
society, but many persons are evidently not too unhappy about the impro- 
prieties they have committed if they are told that their behavior is quite 
common. I am not saying that this is good--or bad— only that this occurs 
very frequently. Our type of post-experimental debriefing might be criti- 
cized on ethical grounds: it helps legitimate a very questionable moral rea- 
soning. For some subjects at least, the statement might imply that it is all 
right to hurt another individual (or steal or lie) if lots of others do the 
same thing. We obviously do not want to impart this lesson. What we are 
trying to do, and I think with some success, is minimize the chances that the 
research participants will experience a loss of self-esteem on being reminded 
by the investigator that they had exhibited antisocial conduct. 

All post-experimental explanations obviously are not alike. Yet several 
studies of the effects of debriefings indicate that they can do a great deal 
to alleviate the unpleasant tension that might have been produced in the 
course of the study. Some observations recorded by Clark after his experi- 
ment with Word (1972) are fairly typical. The subjects in this study were 
led to hear a staged accident under various circumstances and then were 
watched to see if they would aid the supposed victim. Although the exact 
level varied somewhat with the experimental condition, about a third of the 
participants reported either being "very" or "mildly" upset at the time of 
the emergency if this emergency was unambiguous. However, when the experi- 
mental ruse was explained to them at the end of the session, "80% reported 
no longer being upset, 19% were still mildly upset and only 1% indicated
he was still very upset." The investigators also assessed the views of all 
their subjects regarding the value of this kind of research: 

"The overwhelming majority of Ss (95% and 94% respectively) 
either agreed or strongly agreed that this type of research 
is valuable and that the deception practiced was unavoida- 
ble. While there was a more diverse feeling expressed con- 
cerning the ethics involved, only 2% of the Ss reported 
being opposed to the use of stress in psychological ex- 
periments. These findings provide evidence that the parti- 
cipants in these studies felt that the potential worth of 
the research outweighed the negative effects of the stress 
of deception inherent in the situation." 

Berscheid and her colleagues (1973) have published similar observations. For 
one thing, they tell us of a study by Ring and others which essentially repli- 
cated Milgram's obedience experiment:

"After actually participating in the replication, the sub- 
jects completed a questionnaire in which their candid 
reactions to the experiment were solicited. Some of the 
subjects were given debriefing information before filling 
out the questionnaire; others were not. The questionnaire 
was presented to the subjects as an attempt to determine 
'whether any experiments in which you've participated in 
any way violate the rights of subjects ... ' 

... 4% of the Ring et al. subjects who had received de-
briefing information indicated that they regretted they 
had participated in the experiment; on a related dependent 
measure, 4% of the debriefed subjects indicated the experi- 
ment should not be permitted to continue. The correspond- 
ing percentages for subjects who had not received debrief- 
ing information were 43% and 57%, or, on the average, 50%. 
Debriefing, thus, had a substantial amelioration effect 
on the subjects who actually participated in this replica- 
tion of the Milgram paradigm" (cited in Berscheid et al.,
1973, p. 922).

In their own investigation Berscheid and her associates provided university 
students with detailed descriptions of several well-known social psychological 
experiments, including the one by Milgram, asked them to imagine taking part 
in each of the studies, and then gave some of these people information about 
the true purpose of the described research as well as the deceptions that had 
been practiced. This debriefing significantly affected the students' reac- 
tions to the most stressful experiments in the series. Although the results 
differed somewhat from one questionnaire measure to another, the explanation 
given the subjects about the stressful experiments raised their reported hap- 
piness and satisfaction with themselves to the level produced by the non- 
stressful studies. The debriefing information had apparently countered much 
of the felt tension created by the stressful procedures. 

These findings taken together probably reflect what post-experimental 
explanations can do, and not necessarily what they will do in every instance. 
Some investigators obviously will present a more adequate account than will 
others, and all of the participants will not find the explanation equally 
beneficial. Still, both theory and research indicate that debriefing can 
lessen many of the psychological ill-effects that might have been created by 
the experimental procedure, including the subterfuges practiced by the re- 
searcher. 

There is another point that should be raised here. As I mentioned 
earlier, some of the objections leveled against psychological experimentation 
have assumed that whatever adverse consequences result from the treatment 
given the participants, whether anxiety, anger or a bruised ego, might well 
last for a considerable period of time. An individual, that is, might carry
away more than a brief, trivial experience from an experiment. This is
conceivable, certainly, but in the great majority of cases, I am convinced, 
subjects do not give the laboratory happenings much thought when they are 
over. The event is finished. What had taken place is usually quite unim- 
portant to them, and they soon turn their minds to other things. The experi- 
menter's account of the study probably helps them do this. Their behavior 
is translated into something that might be of interest to the investigator 
but is not particularly relevant to their own goals. And it does not matter 
much to them that the experimenter had fooled them for his own purposes. 

Despite all this, some people could be hurt by their participation in the 
investigation. Can we predict how many will suffer and how severe their psy- 
chological injury will be? History and research say "not very well at all." 
Experts have made very inaccurate forecasts when they were asked to anticipate 
the outcome of two controversial social psychological experiments. In the 
first of these, at the time he conducted his research on obedience to authority, 
Milgram asked psychiatrists and others to estimate the proportion of subjects
who would yield to the authority's (i.e., the experimenter's) dictates and 
severely punish the supposedly hapless victim. Although fully 65% of the 
subjects obeyed their instructions and increased their punishment up to the 
maximum, and ostensibly dangerous, level, most of the behavioral science 
specialists had thought that only a small minority would do so. The members 
of the Stanford University Committee on Human Experimentation also failed to 
forecast the impact of social roles on subjects in Zimbardo's simulation of 
prisons (Zimbardo, Banks et al., 1973). In this latter study one group of 
students role-played being guards in a prison-like environment for eight 
hours a day over three shifts, while other men acted as the prisoners for 
24 hours a day. Close observation of the participants as well as their 
self-reports indicated that "this simulated environment was sufficiently 
realistic and forceful to elicit intense, personal and often pathological 
reactions from the majority" (Zimbardo, 1973). As a result, the investiga- 
tors terminated the experiment well before they originally intended. And 
yet the Stanford Committee had previously approved the research proposal be- 
cause the members had not expected these strong reactions. 

Let us look more closely at these two examples of the experts' failure 
to predict people's responses to role demands. The outside observers had not 
been wrong because they had given the investigator the benefit of the doubt, 
exhibiting a willingness to try out the experimental treatments. Rather, 
their theory of human behavior was in error; they had not given sufficient 
weight to the situational influences impinging on the participants, incor- 
rectly assuming that the subjects would remain almost impervious to these 
external forces. In this regard I agree with several other writers who have 
argued that some of the outcry against the Milgram and Zimbardo experiments 
reflects dismay at the demonstration of the power of environmental condi- 
tions over human behavior. Milgram's research probably would have been
criticized much less severely if his subjects had generally resisted the 
authority's pressure. As Helmreich, Bakeman and Scherwitz (1973) put it: 

"The upset generated by a Milgram or Zimbardo, both from 
the public and from their colleagues, in part stems from 
ethical concerns. But another part of their power lies 
precisely in their demonstration of how strong situa- 
tional determinants are in shaping behavior ... Milgram's 
and Zimbardo's studies evoke public outcry in part because, 
through shaming demonstrations, they remind us just how 
fragile our ethical independence and integrity really are." 

Phrasing this type of error somewhat abstractly, it appears that in their 
judgments the specialists had placed too much weight on internal determinants 
of behavior and had unduly minimized the degree to which situational factors 
affect conduct. Or to say this in another way, the observers had not ade- 
quately recognized the substantial variability in human behavior, the extent 
to which action changes from one environment to another. Walter Mischel 
(1968), an eminent writer on psychological assessments, has noted that expert 
psychologists frequently make this mistake. 

This slighting of situational variability also occurs, in a sense, when 
people exaggerate the impact of a single event upon the individual. It is not 
altogether inappropriate, I believe, to regard a person as something like a 
shoot of bamboo. Winds (situational influences) affect the bamboo (the per- 
son) fairly easily and move it about often, first in one direction and then 
another. Yet the basic structure of the bamboo (the individual's personality) 
is not altered so readily. In minimizing situational variability we essen- 
tially deny the individual's flexibility, the degree to which he responds
frequently to environmental stimulation without undergoing a drastic and 
persistent change. Observers also neglect this flexibility when, as I 
commented earlier, they assume that one occurrence, such as a stressful 
treatment in a psychology experiment, will modify the subject's personality 
for a long time afterwards. There can be differences of opinion as to just 
how flexible humans ordinarily are, but I think most people are more in- 
clined to view the personality as relatively fixed and yet fragile than as 
flexible and reactive but still not easily altered in any fundamental way. 

The particular conception of the human personality that we employ guides 
our thinking about the ethical issues in behavioral research. I'll highlight 
what I have in mind here by referring to a research proposal that was recent- 
ly made in England. A social psychologist wished to test his theoretical 
analysis of illegal behavior by placing teenage boys in a laboratory setting 
and then giving them an opportunity to steal money. The psychologist thought 
he would drive a van to certain working class areas of a community, recruit 
adolescents individually to work on an ostensible laboratory task inside the 
van, and then leave each boy alone with a chance to steal some cash. The 
youngsters would not know the actual purpose of the study or that they were 
actually being watched from behind a screen to see what they would do. As 
the psychologist noted in his proposal, this type of laboratory experimenta- 
tion would yield the clearest answers to the theoretical questions he was 
posing and therefore might well have direct social benefits. The granting 
agencies he approached, however, turned him down on ethical grounds. They 
seemed to be mainly afraid that the experimental experience would strengthen 
the teenagers' antisocial tendencies, perhaps by reinforcing their inclina- 
tion to steal again in other situations. While this is a reasonable basis 
for concern, the psychologist who made the proposal believes the granting 
agencies' fears were much too strong. He thinks that most of the adoles- 
cents in his sample would have already done some stealing prior to the ex- 
periment (because of the neighborhoods from which they were recruited) so 
that their laboratory behavior would be, for them, just one more petty 
theft. He doubts whether this single experience would have had any real 
effect on the subjects' habitual mode of conduct. 

I agree with him by and large. However, none of us can guarantee that 
there definitely would not be any increase in the probability of further anti- 
social conduct as a result of the boys' participation in the study. The 
granting agencies' anxiety might be excessive; maybe they assumed that, say, 
10 boys in 100 would have been affected by this experience where, let us 
suppose, only less than one percent of the subjects would actually exhibit 
a heightened likelihood of more thievery. Is not that small increment still 
too much, especially considering the possible consequences? Do the conceiva- 
ble benefits outweigh these possible costs? Who can say with any certainty? 

Now let me get back to the matter of the inaccurate predictions of the 
outcome of the research. I have been arguing that even experts are often 
unable to foretell the results of many behavioral science experiments be- 
cause of the uncertainties and complexities in human behavior and because 
their thinking about behavior frequently disregards human flexibility and 
the force of situational influence. In the two examples I cited, the Milgram 
and Zimbardo studies, the specialists had not anticipated the controversial 
aspects of the research (as seen by later observers), probably partly because 
they had slighted situational determinants. This failure might be regarded 
as an error in favor of the investigators; they were, or would have been, 
permitted to carry out their experiments. However, human experimentation 
review panels are also susceptible to other kinds of errors that could act 
against the researcher being allowed to conduct his investigation. 

What are the members of such a committee asked to do when they judge a 
research proposal? At times they have to assess an experimental procedure 
in light of fairly definite knowledge: will the subject be required to do 
something that is illegal (such as smoke marijuana) or that might get him 
into difficulty with legal authorities (for example, by admitting that he 
has smoked marijuana often) or that is very likely to produce physical in- 
jury (maybe by keeping his hand in ice cold water for too long a period of 
time)? The judgments the committee makes on the basis of this kind of know- 
ledge rarely produce strong objections. Quarrels are much more apt to re- 
sult, of course, when the review panel tries to estimate the stressfulness
of a particular experimental treatment on the basis of very imperfect know- 
ledge and little, if any, prior experience with this technique. Here the 
committee members have to make a behavioral prediction when the stimulus 
situation and the action are quite ambiguous to them. 

Various biases can affect the panelists' forecasts. What is most rele- 
vant to us, I think, is the influence of the judges' set. To a very considera-
ble extent, our interpretation of an uncertain occurrence is shaped by
the ideas that we happen to have in mind at the time (Bruner, 1957). Thus, 
if a person has been exposed to a great many threats in the past, at a later 
time he will be quick to interpret an ambiguous event as also threatening. 
If he has been insulted frequently, he will be inclined to think that an 
ambiguous encounter is one more insult. Behavioral scientists are not im- 
mune from these perception-distorting biases. Specialists in personality 
testing often exaggerate the signs of psychopathology in a test protocol 
(Cronbach, 1970). Psychopathology is so much in their thoughts that they 
may at times be overly sensitive to indications of abnormality and are 
too ready to interpret a strange response as a sign of serious illness. 
They make too much of what might actually be only a small and fairly unim- 
portant detail.

I suggest that a similar phenomenon is apt to occur as a consequence of 
repeated considerations of the risks in experimental research. The more often 
people have to assess the possible dangers in an experimental procedure, the 
greater is the likelihood that ideas of threat and risk will be in their minds 
when they evaluate any given proposal. And as a result, they may be overly 
inclined to interpret an ambiguous experimental technique as a stressful one. 
Here too, they may make too much of something. Has this indeed happened to 
human experimentation review panels? If these committees are becoming in- 
creasingly cautious as they carry out their duties, is this because they have 
become more sensitive to the actual hazards in the proposed investigations-- 
or have they become excessively preoccupied with ideas of danger so that they 
quickly interpret an ambiguous procedure as "probably risky" and then exag- 
gerate the possible costs to the subjects? 

Most discussions of the ethicality of human research have noted that
the investigator might well be a biased judge of the risks inherent in his
proposed study. As the Cook Committee observed in its report to the American
Psychological Association:

"The investigator should not trust his own objectivity in 
balancing the pros and cons of going ahead with research 
that raises an ethical question for him. His personal in- 
volvement tends to lead him to exaggerate the scientific 
merit of what he is about to do and to underestimate the 
costs to the research participant" (Cook et al., 1973,
p. 12). 

Yet the investigator is by no means the only one whose judgment can be biased. 
Review committees can also have a tendency to err, but in the opposite direction.
They may not want to be unfair to the researcher and may try hard to be dis- 
passionate in their evaluation of his planned study. They are not motivated 
to block his endeavors. But still, they could become overly sensitized to 
possible risks and see hazards that do not actually occur to the research 
participants simply as a result of their committee work. 

Without much hard evidence, I suspect that professional ethicists are 
also likely to exhibit this oversensitization. In my discussions about the 
use of deceptions in social psychological experiments with friends at Wisconsin 
who are philosophers of ethics I have been impressed with the way their weigh- 
ing of the costs of the research does not seem to parallel the weights em- 
ployed by our student subjects. For one thing, they tend to regard mislead- 
ing statements and subterfuges in research somewhat more harshly than do most 
of our subjects; as I noted earlier, the great majority of our subjects ap- 
parently view these deceptions as appropriate within the context of a scien- 
tific experiment. These ethicists are also inclined to see a possibly stress- 
ful experimental technique as being harder on the subjects than do the sub- 
jects themselves. Once, when I made this observation to an ethicist, he sug- 
gested that the participants might feel intimidated by us, much the way poor 
blacks in the Deep South resented their treatment at the hands of whites
but were afraid to speak up. This analogy is quite imperfect, of course. 
Blacks might have been reluctant to complain directly to whites but they still 
expressed their feelings to each other. Psychology students do talk to each 
other about experiments but we have never heard that they were annoyed by the 
ruses and deceptions practiced on them. They occasionally complain about 
what they think is an excessively boring and trivial investigation, but I have 
not heard of students muttering about a stressful procedure that was reasonably
explained to them. All in all, some aspects of social psychological experi- 
ments are evidently much more unpleasant to these particular philosophers (at 
least) than to the young men and women who actually serve in the studies. Ethi- 
cists are adept at analyzing the ethical issues in controversial problem situa- 
tions. Nonetheless, their training and experience might also cause them to 
exaggerate the costs of a given experiment to the participants. 

Who is in the best position to predict these costs? I do not believe that 
the investigator should be ruled out altogether. While his judgment could be 
biased, he is usually also the person with the greatest amount of experience 
with the research procedure in question. If he has carried out similar stu- 
dies in the past with the same techniques, he is more likely than the members 
of the review committee to know whether his procedures actually do disturb the 
participants. Serious consideration should obviously be given to this know- 
ledge. Yet his judgments of the costs and benefits can admittedly be distorted 
by his personal and professional desires. The best solution, it seems to me, 
is to obtain reactions from observers drawn from the same population as the
research participants.

Various writers have also advanced this notion. The Cook Committee of 
the APA implicitly argued that research evaluations should be obtained from
judges who are similar to the subjects when it discussed the reason why the 
investigator's bias had to be corrected: The researcher "may be hindered 
from seeing costs from the subject's point of view, because of differences 
in age, economic and social background, intellectual orientation, and rela- 
tionship to the project itself" (Cook et al., 1973, p. 12). As a result of
his experience with his simulated prison study, Zimbardo (1973) also con- 
cluded that "students or representatives of the population being studied" 
should be part of the institutional committee passing on the ethics of human 
experimentation. Berscheid, Baron, Dermer and Libman (1973) believed that 
the use of representative samples would even permit evaluation committees to 
estimate the percentage of research participants who would object to serving 
in a given study: 

"... draw a sample from the proposed subject population, 
present it with the full procedure to be followed in the 
experiment along with the purpose of the experimentation 
and determine the extent to which these subjects would 
be willing to participate in the experiment described 
... From this 'role-play-sampling' procedure, consent 
rates could be projected for the subject population" 
(Berscheid et al., 1973, p. 914). 

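Berscheid's "role-play-sampling" projection is, at bottom, an estimate of a binomial proportion. A minimal sketch of how such a projected consent rate, with a rough margin of error, might be computed (illustrative only; the function name and the sample figures below are hypothetical and are not drawn from Berscheid et al.):

```python
import math

def project_consent_rate(sample_responses, z=1.96):
    """Project a population consent rate from a role-play sample.

    sample_responses: list of booleans, True meaning the respondent says
    he would be willing to participate after hearing the full procedure
    and purpose of the proposed experiment.
    Returns (observed proportion, lower bound, upper bound) using the
    normal approximation for a 95% interval when z = 1.96.
    """
    n = len(sample_responses)
    if n == 0:
        raise ValueError("need at least one sampled respondent")
    p = sum(sample_responses) / n          # observed consent proportion
    se = math.sqrt(p * (1 - p) / n)        # standard error of the proportion
    return p, max(0.0, p - z * se), min(1.0, p + z * se)

# Hypothetical figures: 47 of 50 role-play respondents say they would serve.
rate, low, high = project_consent_rate([True] * 47 + [False] * 3)
```

The arithmetic, of course, does nothing to answer the harder question raised in the text: what projected rate of refusal, held with what degree of conviction, should be enough to stop a study.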
I would not care to follow Berscheid's recommendation to the letter. If this
strategy had to be carried out for every proposed investigation, research
would become much more expensive in money, time and effort. Moreover, how can
we establish an amount and intensity of consent that would be consistent and 
yet reasonable for every study? Should an experiment be halted if five per- 
cent object or four percent? What if ten percent of the participant sample 
express misgivings but only tentatively? Is this better or worse than five 
percent objecting strongly? Only a rigid and expensive bureaucracy could 
deal consistently with these questions and the other problems that inevita- 
bly would arise if every research proposal had to be screened by a sample 
representing the research participants. Then too, as Berscheid and her assoc- 
iates recognized (1973, p. 914), their recommended procedure is open to the 
criticisms that have been lodged against role-playing techniques generally. 

Let me digress for a moment to take up this particular matter for we 
have here an issue that is closely associated with the attacks on deception 
in psychological research. If it is ethically wrong and methodologically bad 
to fool subjects in an experiment, as some writers have charged, what kinds 
of investigations should be conducted? Kelman (1967), among others, offered 
an answer. The researcher should not attempt to arouse the actual attitudi- 
nal or emotional state that he wishes to study; this probably would require 
subterfuges. Instead, he should merely describe a situation to his research 
participants in which this psychological state is likely to exist and ask 
them how they would behave. The subjects play the role of a person in that 
situation rather than being actually exposed to the relevant condition. In 
the Berscheid procedure the participant samples are asked to play the role 
of someone receiving a particular treatment and then indicate how they think 
they would respond. 






Freedman (1969) has pointed out the shortcomings in this role-playing 
technique. He noted, for one thing, that relatively few people care to ad- 
mit they would act in a socially disapproved fashion even though many of them 
actually do so at times. When he describes the Milgram obedience setting to 
his students, none of them say they would administer the extremely severe 
punishment demanded by the authority and yet a majority of Milgram's subjects 
had complied with the authority's dictates. It amounts to this: "sometimes 
subjects can guess accurately how they would behave; sometimes they cannot. 
Any time subtle factors or interactions are involved, any time actual be- 
havior runs counter to what is considered socially desirable or acceptable, 
guesses will probably tend to be wrong. But, most important, one can never 
know ahead of time whether the guess is right or wrong until the people are 
observed in the real situation ... The argument comes down to the simple 
truth that data from role-playing studies consist of what some group of 
subjects guess would be their reaction to a particular stimulus. The sub- 
jects are giving their estimates, their intuitions, their insights, and 
introspections about themselves or others. If we are studying the myths 
and values of a society, these data would be useful. If we want to know 
how people actually behave, they are, at best, suggestive. If we are in- 
terested in people's intuitions, fine; if we are interested in their beha- 
vior (other than guessing behavior), we must ordinarily use the experimental 
method" (Freedman, 1969, pp. 110-111). 

Several direct comparisons of the results obtained by role playing and 
deception procedures have generally confirmed Freedman's observations (e.g., 
Willis & Willis, 1970). Sometimes people's estimates of how they would react
to a hypothetical situation faithfully mirror the behavior of those in
the actual situation; they are familiar with this type of condition, are 
aware of how they had responded in the past, and are not motivated to distort 
their reports. At other times, however, the role-playing subjects' guesses 
do not parallel actual behavior because they lack the requisite experience 
and/or awareness, or are trying to present themselves in a favorable light 
and this is easier to do in the role-playing than in the more spontaneous 
experimental situation. In sum, we cannot be sure when the guesses are 
right. We could not always tell whether the participant samples' reactions 
to the described situation accurately reflected the actual subjects' feel- 
ings. 

The chances are, nevertheless, that judges drawn from the same popu- 
lation as the research participants would offer better estimates of the 
latter's reaction to the experimental treatments than would others of a 
dissimilar age and background. An institutional human subjects review 
panel would be well-advised to obtain "input" from representatives of the 
population being studied. Here too, though, I would recommend a fairly 
frequent replacement of the panel membership. Just as those who are repeat- 
edly engaged in assessing the risks in behavioral research might become over- 
ly inclined to see hazards in the ambiguous research settings, so might the 
participant-representatives become overly sensitized. With continued ex- 
perience on the committee, their ability to mirror the participant popula- 
tion faithfully therefore declines, because of their increased sophisti- 
cation as well as a possible hypersensitivity to risks. 






The thrust of my argument so far is that most criticisms of the ethi- 
cality of human experimentation in the behavioral sciences are based on exag- 
gerated fears. This does not mean that it is not necessary to obtain the in- 
formed consent of the research participants before they are exposed to the 
investigation. Together with practically every other behavioral researcher, 
I subscribe to the statement made by the Cook Committee: 

"The psychologist's ethical obligation to use people as 
research participants only if they give their informed 
consent rests on well-established traditions of research 
ethics and on strong rational grounds. The individual's 
human right of free choice requires that his decision to 
participate may be made in the light of adequate and 
accurate information" (Cook et al., 1973, p. 27). 

The question is, what kind of information should be provided? As the APA 
committee observed, "Ethical problems arise because the requirements of effec- 
tive psychological research often conflict with the simple fulfillment of this 
obligation to obtain informed consent." How can this conflict be resolved? 
Indeed, is there any definite solution? 

Let us begin this discussion with the time the investigator first encoun- 
ters the research participants. The initial question is whether the researcher 
is obligated to inform his subjects that he is studying them. There is little 
problem here (for our present purposes) when the participants are volunteers. 
They know they are in an investigation. However, what if the researcher wants 
to observe people in naturalistic settings? Does he have to tell them of his 
interest and purposes? As the Cook Committee observed, "the boundary between 
drawing legitimately on one's everyday experience and spying is a narrow one. 
Some critics feel that the investigator who invades private situations under 
false pretences or with concealed observation is entirely out of bounds; others
feel that there are problems and circumstances in regard to which it may be 
warranted" (p. 32). I am in this latter group. 

Suppose a sociologist was interested in the interactions among guests at 
cocktail parties. Let us say that he simply recorded his general impressions 
after each party he attended and then pulled his observations together some- 
time later in an overall report. None of the guests can be identified in 
this report. In this case I would say that the investigator is not ethically 
bound to announce his research intentions every time he goes to a party. Re- 
quiring him to declare his purposes would also mean that every writer should 
proclaim his professional role whenever he met other people. The writer, 
like our sociologist, stores his impressions in his memory and then employs 
these recollections in one way or another in a later story, article or book. 
A novelist is basically no different from a sociologist in this regard even 
if the latter tallied the frequency of certain acts and the novelist only 
formed vague judgments of how frequently something was done. Both seek to 
portray an aspect of social reality. Nor does it matter, I believe, what 
their intentions were when they entered a social situation. A writer, whether 
he makes up stories or conveys a group's ideas, sooner or later will use his 
experiences in some fashion in his work. It may be the experiences of the 
moment if they strike his fancy or seem important to him. And continuing 
in the same vein, I do not think we can differentiate between the sociolo- 
gist and the writer when the portraits they draw are unfavorable to a parti- 
cular group. A novelist does not have to identify himself to those he meets 
even if he will eventually satirize their way of life, and the sociologist
does not have to say what he is doing although his report may have negative 
things to say about people who go to cocktail parties. 

Neither the present sociologist nor the novelist manipulates anything in the 
course of their observations. The problem becomes somewhat more complicated 
when the investigation produces a substantial variation in the lives of the 
participants. Sometimes this is unintended, but at other times the altera- 
tion may be deliberate, as when a field experiment is conducted. The book 
"When Prophecy Fails," by Festinger, Schachter and Riecken illustrates the 
complexities in the former type of research. In order to test their analysis 
of what happens after a failure to confirm a strongly held belief, the in- 
vestigators sent several participant observers to join a group of persons 
in a nearby community who predicted that the city would soon be inundated 
by a flood. Needless to say, the catastrophe did not occur and the observers 
recorded the group reactions. When the report was published the research 
was criticized by at least one behavioral scientist (Smith) on ethical grounds; 
by introducing other persons into the group who pretended they believed the 
flood prediction, the researchers might have helped support the group's be- 
lief. They therefore presumably exposed the members to a somewhat greater 
shock when the expectation was not confirmed. Well, I cannot say that I 
share the critic's misgivings in this case. The group members were in danger 
of scorn and disapproval even without the extra support introduced by the re- 
search team. Further, the critic's point could question a good many partici- 
pant-observation studies. From my perspective the gains that might result 
from this kind of research often outweigh the slight increment in costs pro- 
duced by this type of unintentional variation. 






But what about deliberate manipulations of the attitudes and feelings 
of people who do not know they are in an experiment? The experiment of 
Piliavin, Rodin and Piliavin (1969) is a good example. These researchers 
wanted to investigate some of the conditions affecting the willingness to 
aid a person in distress. Pursuing this aim, they staged a series of acci- 
dents in a New York subway car, varying the race of the victim (white or 
black) and whether he appeared to be drunk or a cripple. Certain naturally 
occurring variations were also examined, such as the number of onlookers in 
the car. More and more field experiments such as this one are being con- 
ducted in social psychology, covering an ever wider range of research ques- 
tions and settings. I view most of these studies as legitimate enterprises. 
Although it is true that the research participants are being manipulated 
by the investigators, they (a) are confronted by the kind of situation that 
could easily occur naturally in their environment, and do not realize that 
their attitudes are being operated upon. Moreover, (b) the ultimate goals 
of this research are socially quite defensible. 

These two points are fairly important, I believe. The first one means 
that the participants will not have a feeling of being pushed around and will 
have no reason to believe that their individual autonomy and dignity have 
been violated. For them, they are only facing the kind of life situation 
they might normally encounter and their habitual modes of adaptation can 
readily deal with whatever happens. They therefore should not suffer any 
loss of self-esteem. No matter what they do, whether they help or do not 
aid the victim in a Piliavin-type situation, their customary ways of think- 
ing will tend to justify their action, and there is little likelihood that
they will be substantially affected. However, the research participants 
are actually being manipulated, of course, and my second point is that the 
social benefits that derive from the accumulation and dissemination of 
scientific knowledge about human behavior are greater than this relatively 
small cost. 



My argument, then, is that the people involved in most field experi- 
ments do not have to be told that they are taking part in a study. This 
is also scientifically desirable. Informing them beforehand of the experi- 
ment is very likely to produce a Hawthorne Effect. Many persons alter their 
conduct when they think they are being watched, even if the observers are 
researchers. They want to look good, gain the approval of the onlookers, 
and so they are particularly apt to do the "right thing." Consequently, 
their behavior may not be representative of how they would normally act in 
this "real world" setting. The advantages of the field experiment are 
therefore lost to the investigator. 

This reasoning obviously has implications for the debriefing procedure. 
I suggest that if the participants do not realize they are in an experiment, 
it is ordinarily unnecessary, and may even be undesirable, to let them know 
afterwards what had actually happened. My contention is that the staged 
event will probably have only a fleeting impact on the subjects because 
their ordinary defenses and ways of thinking enable them to adapt readily 
to the occurrence. These defenses are directly confronted when the experi- 
menter reveals what he had done to the participants. Consider the subway 
riders in the Piliavin et al. study. How would they feel if the investigators
had explained their purposes? Those subjects who had aided the "victim" 
might be pleased, of course; they had behaved in a socially approved fashion. 
But, on the other hand, what about those who had not been helpful? By talk- 
ing about the experiment, the researchers essentially tell these persons that 
they had not acted properly. Their self-esteem could then suffer. 

The reader might ask at this time, what is the difference between these 
particular research participants and the subjects in a university psychology 
experiment? Suppose the main features of the Piliavin study had been esta- 
blished under the laboratory conditions (and this has actually been done many 
times), and a subject fails to assist the individual in need. Would he not 
also experience a blow to his ego at learning afterwards that he had not acted 
in a socially responsible manner? How can we justify the post-experimental 
explanation for him, and even say that this explanation is obligatory, while 
recommending no debriefing for those taking part in most field experiments? 

The major difference, it seems to me, is that the laboratory subject 
knows he has responded to some experimental treatment. He is owed at least 
an account of the investigation in order to justify whatever coercion or pres- 
sure he felt in taking part in the study and to lessen whatever stress he 
might have experienced. The debriefing might not eliminate the ill-effects 
of the experiment altogether. There might even be a small chance that the 
researcher's revelation will wound the subject by pointing up his "bad" or 
undesirable behavior. Yet we should take this risk in order to help restore 
his sense of autonomy. If the research participants had not lost this feel- 
ing of independence, it is not necessary to expose them to the possible hazards
of the post-experimental explanation. They do not have to regain something 
they have not lost. 

Of course, there are times when the participants in field experiments 
should be given the same kind of careful debriefing provided to the laboratory 
subjects. In general, this is when there is some kind of indication that the 
participants had been upset, disturbed or otherwise emotionally aroused by 
the experimental procedure. These are very complex considerations, and I be- 
lieve this section is best concluded with some comments made by the APA's 
Cook Committee: 

"When the man in the street becomes an unwitting partici- 
pant in research, realism has been combined with experi- 
mental control, but sometimes at considerable ethical cost. 
Informed consent is impossible. In the least questionable 
cases neither the anonymity nor the personal dignity of the 
participant is violated, and patience is only trivially 
imposed upon. But offenses to human dignity are readily 
imaginable in this sort of experimentation. As such pro- 
cedures become more numerous in an effort to obtain infor- 
mation about important social issues, there is reason to 
fear their cumulative effect ... such research can be con- 
sidered only with misgivings ... ." (Cook et al., 1973, 
p. 33). 

Moving on to consider another aspect of the investigator's dealings with 
the research participants, we now come, finally, to the matter of information 
about people's discomforts. Virtually everyone is agreed that it is desira- 
ble to tell the potential subjects what will happen to them at the time their 
cooperation is being solicited. They should know what they will be getting 
into. HEW regulations stipulate that informed consent requires "a description 
of any attendant discomforts and risks reasonably to be expected," while the 
APA list of ethical principles includes this statement: 






"Ethical practice requires the investigator to inform the 
participant of all features of the research that reasonably 
might be expected to influence willingness to participate 
... " (Cook et al., 1973, p. 29). 

Here too, however, a conflict can arise between this very reasonable, easily 
understandable principle and the scientific requirements of the research. 
One problem is that the potential participants might be frightened unduly. 
In another background paper to the National Commission, Robert J. Levine cites 
an experiment by Epstein and Lasagna which documents some of the perils of 
overdisclosure: 

"They presented consent forms of various lengths and 
thoroughness to prospective subjects of a drug study. 
They found that the more detail was included the more 
likely were the prospective subjects to be either con- 
fused or intimidated" (pp. 17-18). 

Could it be that the great emphasis on the possible ill-effects of the drug 
produced the same kind of overweighing of conceivable dangers that I discussed 
earlier? Just as personality testers sometimes give excessive attention to 
faint signs of psychopathology in a test protocol, the Epstein and Lasagna 
subjects might have exaggerated the hazards in taking the drug because their 
attention was focused almost exclusively on these possible risks. In much 
the same way, a behavioral scientist could arouse much too much anxiety in 
his potential subjects by overemphasizing the conceivable sources of discom- 
fort in his investigation. By enumerating everything that might possibly go 
wrong, he causes them to "accentuate the negative." 

Another problem (from the researcher's perspective) is that complete in- 
formation about every possible source of unhappiness could lessen the effec- 
tiveness of the experimental treatment. If the prospective participant was
told about every feature of the research that might influence his willingness 
to participate, it would be difficult (if not impossible) to carry out 
some kinds of experiments. Researchers would probably be unable to examine 
experimentally the consequences of anger or anxiety arousal. Following the 
APA's ethical principle to the letter, the potential subjects would have to 
be informed that, say, they might be frightened (or upset or emotionally 
aroused) in the course of the study. After all, this information could 
"reasonably" affect their willingness to be in the investigation. But ob- 
viously, if the subjects had this knowledge and agreed to participate, it 
would be exceedingly difficult to create the appropriate feelings within 
them. Being forewarned, they are forearmed against the experimental treat- 
ment. 

In my view this particular principle should serve as a general guideline 
rather than as a strict rule. The Cook Committee clearly recognized this. 
After presenting the principle we are now discussing, this committee then 
went on to say: 

"When the methodological requirements of a study neces- 
sitate concealment or deception, the investigator is re- 
quired to ensure the participant's understanding of the 
reasons for this action ... " (p. 29). 

In other words, the post-experimental debriefing could compensate considerably 
for the lack of full disclosure at the time the subject's consent is obtained. 

From where I stand, an appropriate compromise is to mention explicitly 
each possible source of physical discomfort (e.g., that electric shocks may 
be employed in the study) when the pre-experimental information is given, 
but not say anything at this time about the psychological manipulations 



24-33 



that will be carried out. However, and I think this is exceedingly important, 
the investigator should also emphasize that the subject is free to withdraw 
from the study at any time he wishes with full payment or credit and without 
jeopardizing his relationship with the researcher or institution. 

The reader's values obviously will determine his reaction to this kind 
of compromise or, for that matter, his response to the general trend of com- 
ments in this paper. By and large, those with a strong humanistic orienta- 
tion will be especially repelled by the idea that our research participants 
are often exposed to psychological stresses or even that the subjects' atti- 
tudes and feelings are being manipulated without their fully informed con- 
sent. I do not mean to question the desire to preserve individual dignity 
and autonomy. I do believe, nevertheless, that the advance of behavioral 
science can contribute to the preservation and strengthening of these values. 
People are being manipulated every day by forces outside of their control 
and often to their personal detriment. The development and dissemination 
of behavioral science knowledge can lead to a greater awareness of these 
influences and the steps that might be taken to counteract them. A sound 
behavioral science can help uncover the truth about determinants of human 
conduct, and as in other domains of life, the truth can make us free. 






25 

SELECTED ISSUES IN INFORMED CONSENT AND CONFIDENTIALITY 

WITH SPECIAL REFERENCE TO BEHAVIORAL/SOCIAL 

SCIENCE RESEARCH/INQUIRY 

Albert Reiss, Jr., Ph.D. 
February 1, 1976 



I . INTRODUCTION 

This essay explores what Edward Shils calls the confrontation of 
autonomy and privacy by a free intellectual curiosity (1959:121). It 
does so by examining how institutions of consent and confidentiality are 
organized in behavioral science inquiry. Their role in regulating the ac- 
quisition, processing, and dissemination of knowledge is its major concern. 
Regulations instituted by the Federal Government for implementation by 
agents who sponsor or undertake sponsored inquiry are reviewed for the 
issues they present for behavioral science inquiry. Special attention is 
given to analyzing the Code of Federal Regulations for the protection of 
human subjects in research, development, and related activities supported 
by Department of Health, Education, and Welfare grants and contracts 
(45 CFR 46), the proposed code of regulations governing the confidentiality 
of individually identifiable research and statistical information collected 
under Law Enforcement Assistance grant programs (28 CFR 22), and the proposed 
code of regulations to protect the privacy of research subjects by with- 
holding from all persons not connected with the research the names and other 
identifying characteristics of such subjects in research on mental health 
sponsored by the Department of Health, Education, and Welfare (42 CFR 2a). 

The Problem Setting 

The behavioral scientist's access to information is limited by important 
proprietary rights in information and individual and collective rights to 
secrecy and privacy. Governments assert rights to keep secret or confiden- 
tial information to protect national security and the deliberative processes 
of executive, legislative, and judicial agencies, and information on individ- 
uals or collectivities to which it is privy to insure their privacy and
protect their proprietary rights. Corporations and other collectivities such 
as professions and voluntary bodies have legally guaranteed proprietary 
rights to information to protect the autonomy of the organization and their 
clients' right to privacy. There are, similarly, proprietary interests for 
private persons and a right to security of private personal expression and 
affairs (Warren and Brandeis, 1890; Pound, 1915:343). 

In a free and open society, these proprietary interests and private 
rights confront public rights and claims to information. What is available 
in the public interest depends upon both law and custom, including the 
customs of a scholarly community, and its interpretation in any given case 
as to what is public and what is privileged. The federal Privacy Act 
and the Freedom of Information Act among others define rights and privileges 
in information and access to information. 

The behavioral scientist's access to information is normatively a matter 
of right to information that is public and a matter of consent where it is 
proprietary, private, or privileged. How to regulate the acquisition, 
processing, and dissemination of information is especially problematic in 
a free and open society. At the present time regulation is in a state of 
flux. Some recent federal and state laws make the information of public 
bodies more accessible to inquiry while at the same time information for 
private organizations and persons is subject to more legal, ethical, and 
organizational regulation to protect proprietary rights in information and 
corporate and individual rights to privacy. The customary fiducial relation- 
ship of scientific investigators and their sources of information is both 
subject to growing regulation in the interest of protecting the rights and 
integrity of those sources and jeopardy by the inability of investigators 
to resist efforts to break confidences or to control their misuse. Traditionally,
investigators guaranteed their sources of information the protection 
of confidentiality but the growth of legal challenges to their right to 
confidentiality threatens the foundation of their fiducial obligation. In 
what follows some issues and problems in obtaining information through a 
fiducial relationship of consent and confidentiality are explored and 
ethical, legal, and organizational forms of regulation to protect proprietary 
rights in information, corporate and individual rights to privacy, and the 
privileges of investigators in behavioral science research are examined. 
Both trust and privilege are paradoxically elements in maintaining scientific 
inquiry in a free and open society. 

Right to Privacy 

The "right to individual privacy" has its roots in the common law 
(Warren and Brandeis, 1890) and it has gradually been extended to corporate 
bodies in one form or another. The "right to privacy" is a complex legal 
concept embracing several related concerns such as the right of individuals 
(1) to be "left alone," (2) to be secure from intrusion into private affairs 
by unwarranted means, and (3) to be secure against unauthorized entry into 
one's domicile or private place. The right extends also to proprietary 
interests in intellectual property such as trade secrets, original work 
subject to patent or copyright, and the like. Each of these rights may be 
intruded upon by behavioral science inquiry. Transgressions are not easily 
defined or recognized. Worth pondering are questions such as these: (1) is 
the entry of a research observer with a police officer into the domicile of 
a private citizen a "lawful entry"? (2) is the recording of a public meeting, 
including such 'private conversations' as may take place during the meeting, 
an intrusion by unwarranted means? (3) is privacy respected when one has
the consent of an employer to secure information from the personnel records 
before information on identity of the employee is removed? 

At law, the privacy of another is invaded when there is an unreasonable 
interference in making public any affairs that a person wishes to remain 
private. A social research investigator invades privacy when he is respon- 
sible for public disclosure of private facts or when such public disclosure 
puts another in a derogatory light before the public (Goldstein, 1969:423). 
A proposed revision of the law of torts prepared by the American Law Institute 
broadens considerably the concept of invasion of privacy to include "... 
one who intentionally intrudes physically or otherwise, upon the solitude or 
seclusion of another or his private affairs or concerns ... if the intrusion 
would be highly offensive to a reasonable man" (1969:418). As Nejelski and 
Lerman note (1971:1126) intrusion upon solitude may occur when social scientists 
make unobtrusive observations and the identity of those observed becomes known. 
Whenever consent is lacking as an element in securing information on private 
matters, the investigator risks invading the privacy of others, even when 
that information is secured in public settings. Much may depend, of course, 
on the capacity of investigators to keep private information from becoming 
public knowledge. 

In much, though not all, behavioral science research, there is some 
intrusion upon the privacy of others, be it ever so slight. Apart from the 
fact that research investigators have a legal liability to suit for invasion 
of privacy, ethical values constrain the intrusion upon privacy without 
recourse to consent or some appeal to a priority of values. We shall briefly 
examine below some of the principal criteria invoked to justify intrusion into 
private affairs. 

The typical criterion invoked is that intrusion on privacy is justified
in the interest of developing new knowledge or scientific knowledge. The 
criterion of "developing new knowledge" is of little utility since all knowl- 
edge is in some sense "new." Perhaps one is on somewhat firmer grounds in- 
voking the criterion of contribution to "scientific knowledge." Ordinarily 
to qualify as scientific knowledge, the study design should meet at least 
minimal criteria of scientific method. The criterion of scientific or 
methodological merit of the research design may be unduly restrictive on 
scientific exploration, however. Much exploratory social research, par- 
ticularly that by participant observation, might fail by methodological 
criteria. The issue as to whether exploratory research into private matters 
is justifiable, absent a formal design of scientific merit, merits careful 
consideration. 

Apart from the simple intrusion into the seclusion of others, intrusion 
occurs in obtaining information on the private matters of specific identifiable 
individuals. The degree to which the investigator designs instruments that 
define in advance these private matters affects the extent to which one can 
test whether the intrusion is warranted in the interest of new or scientific 
knowledge. The more unplanned and diffuse the intrusion into private matters, 
the more one is likely to probe for additional information; and, the more one 
searches for the "confidential," the more likely one is to intrude upon matters 
that are purely personal and private and perhaps more potentially damaging 
to subjects or corporate bodies.2 That behavioral scientists may deliberately 
search for the "hidden agenda," the "latent attitudes," the evidence for 
deviance or corruption is clear from many studies. The responsibility for 
utilizing techniques of investigation that deliberately search for these 
intrusions is not commonly dealt with in reports of such intrusions; yet they 
merit careful consideration. 






Where the intrusion is planned, careful consideration must be given to 
the trade-offs between the relative degree and cost of intrusion into the 
privacy of others and the gains from it. On what grounds does one justify 
questions about drug use, for whom one voted in the last election, or one's 
income? What will happen to the response rate if just prior to asking the 
question one advises the respondent of freedom not to answer the question? 
How much of a "no response" or refusal rate, or of what is called error in 
reporting, stems from the respondent's belief that it is a private matter 
and of no concern to the investigator? These seem like questions worth 
answering if one is to intrude upon the privacy of others. At the present 
time judgments about the relative privacy of matters cannot be scaled 
precisely and compared with judgments about the net worth of gaining that 
information. Yet whether one uses the legal criterion of objectionable 
to a "reasonable man" or an empirical criterion such as the percent of sub- 
jects objecting to the asking of, or responding to, a particular question, 
differences in the relative privacy of matters are determinable. The 
determination of the net worth of undertaking a particular investigation may 
be a more difficult task, though such judgments are commonly made in rating 
research proposals for financial support. What remains problematic, how- 
ever, for those who advance this criterion, is what criteria shall govern 
decisions to undertake research once the cost of intrusion into privacy and 
the net worth of the knowledge have been established. 
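The empirical criterion just mentioned, the percent of subjects objecting to a question, can at least be tabulated mechanically. The following sketch is offered only as an illustration; it is not part of the original discussion, and the question labels and responses are hypothetical.

```python
# Hypothetical sketch: ranking survey questions by the share of
# respondents who objected to being asked them -- one empirical
# proxy for the relative privacy of matters, as discussed above.

def objection_rates(responses):
    """responses maps a question label to a list of booleans,
    True meaning the respondent objected or refused to answer."""
    return {q: sum(flags) / len(flags) for q, flags in responses.items()}

# Hypothetical data for three questions of varying sensitivity.
data = {
    "drug use": [True, True, False, True, False],
    "vote in last election": [False, True, False, False, False],
    "income": [True, False, True, False, False],
}

rates = objection_rates(data)
# Order questions from most to least "private" by this criterion.
ranking = sorted(rates, key=rates.get, reverse=True)
```

Even with such rates in hand, of course, the harder question posed in the text remains: how to weigh the relative privacy of a matter against the net worth of the knowledge gained.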

Alternatively, some investigators invoke the criterion that intrusion 
on privacy is justified when the knowledge is necessary to matters of public 
importance or interest. One is justified, for example, in asking questions 
about birth control, abortions, and unwanted pregnancies as essential to the 
formation of population policies. A difficulty with this criterion is that 




so long as an investigator determines what is in the public interest, there 
can be obvious contamination of judgment. In any case, there again are no 
clear decision criteria for making judgments based on relating the relative 
public importance of matters to the relative costs of intruding into private 
matters. 

Consent. The criterion most commonly invoked by scientists to intrude 
upon privacy is that intrusion into private matters is justified for scientific 
inquiry when consent is secured for access to these matters. Clarification 
of this criterion raises questions about who shall secure consent from whom, 
how, and with what anticipated consequences from participation. The institutional 
doctrine that derives from an answer to these questions is that of 
informed consent. Consent "... concerns the conditions under which information 
is obtained from a person" (Ruebhausen and Brim, 1965:1197); it is an 
affirmative agreement by free choice to provide information under stated or 
agreed upon conditions. For consent to be informed means that anyone consenting 
must be able to predict reasonably well, from a description of the 
procedure to be used in acquiring information and from such other information 
as is provided, what information will be sought and what risks or benefits will 
follow from participation, given only the information provided at the time 
consent is initially requested. Formally, informed consent is an agreement 
that satisfies the conditions of an enforceable contract. The definition of 
informed consent currently operative in the regulations applicable 
to all Department of Health, Education, and Welfare grants and contracts supporting 
research, development, and related activities in which human subjects 
are involved follows (45 CFR 46.3): 

"informed consent means the knowing consent of an individual 
or his legally authorized representative, so situated as to be able 
to exercise free power of choice without undue inducement or any 
element of force, fraud, deceit, duress, or any other form of constraint 
or coercion. The basic elements of information necessary 
to such consent are: 

(1) A fair explanation of the procedures to be followed, and 
their purposes, including identification of any procedures which 
are experimental; 

(2) a description of any attendant discomforts and risks 
reasonably to be expected; 

(3) a description of any benefits reasonably to be expected; 

(4) a disclosure of any appropriate alternative procedures 
that might be advantageous for the subject; 

(5) an offer to answer any inquiries concerning the procedures; 
and 

(6) an instruction that the person is free to withdraw his 
consent and to discontinue participation in the project at any 
time without prejudice to the subject. 

Each of these elements of informed consent is examined in Section II below, 
particularly as each bears upon behavioral science research. 
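As a reading aid only, the six basic elements quoted above can be restated as a simple checklist. The phrasing below paraphrases the regulation, and the helper function is our own hypothetical illustration, not anything found in 45 CFR 46.

```python
# The six basic elements of informed consent under 45 CFR 46.3,
# paraphrased as a checklist. The wording is shorthand, not
# regulatory language.
CONSENT_ELEMENTS = (
    "fair explanation of procedures and purposes, identifying any that are experimental",
    "description of attendant discomforts and risks reasonably to be expected",
    "description of benefits reasonably to be expected",
    "disclosure of appropriate alternative procedures",
    "offer to answer any inquiries concerning the procedures",
    "instruction that consent may be withdrawn and participation "
    "discontinued at any time without prejudice",
)

def missing_elements(covered):
    """Given the set of element indices a consent form covers,
    return the elements it omits (hypothetical helper)."""
    return [e for i, e in enumerate(CONSENT_ELEMENTS) if i not in covered]

# A hypothetical form covering the first five elements but
# omitting the withdrawal instruction:
omitted = missing_elements({0, 1, 2, 3, 4})
```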

Confidentiality. Issues in informed consent in behavioral science research 
cannot be discussed fully without reference to the question of the private 
or confidential nature of much information and its protection. Confidentiality 
refers to "... the conditions under which the information is used" 
(Ruebhausen and Brim, 1965:1197); it involves an obligation to keep private 
matters confidential and free from public disclosure unless there is consent 
from the private party or some overriding collective interest in making 
such matters public. There are other reasons, however, why the matter 
of informed consent is inextricably interwoven with the confidentiality of 
information and its protection. 

First, an element in informed consent is to apprise the party from whom 
consent is sought of any risks involved from participation in the research. 
It is commonly the case in behavioral science inquiry that there is little 
harm in the procedure for acquiring information but that when harm arises 
it does so from the public disclosure of private or confidential matters 
that were communicated as a confidence. A fiducial relationship is at stake. 
Once an investigator acquires any information for social research that can 




cause harm, as a party to that information, he is potentially an agent for 
doing harm. Where protection of confidential information cannot reasonably 
be guaranteed, an element in informed consent should be to advise that there 
is some risk of disclosure where there is no adequate legal protection. 

It can indeed be argued that, to provide adequate protection in behavioral 
science inquiry where information is acquired on private matters and there 
is no legal protection or sanction against compelled or unauthorized 
disclosure, it should be mandatory to inform the person from whom information 
is sought in a manner akin to that of a Miranda warning: "I must advise 
you that you have a right to refuse to participate or to answer any query 
put to you, since anything you say or do can cause you harm, for I cannot 
legally protect any of the information that you disclose to me, including 
the fact that you were a participant in this study." 

Second, there are risks even for parties who refuse to participate 
in a particular behavioral science or bio-medical study, should the investigator 
be legally compelled to publicly disclose that fact of refusal or should it 
otherwise become public knowledge. Consider making public a list of persons 
who refused to participate in a study of "former patients in a drug addiction 
center," a study of "homosexual networks," or a project studying "persons 
discharged by their employer." Might not such disclosure cause considerable 
damage to reputation and substantially risk future opportunities and benefits 
as well for those who refused? A particularly thorny problem thus is raised 
about approaching persons for their consent when even the knowledge of that 
approach is potentially harmful. No Miranda-type warning will suffice in such 
a situation. Where confidentiality is at issue, any protection of informed 
consent is insufficient when public disclosure of refusal is harmful. 

Third, where confidentiality must be maintained to protect the parties 
from whom information is obtained, the requirement that one advise of the 
risks that might reasonably be expected may prove unusually burdensome. 
This is so for a number of related reasons. Often one lacks sufficient 
knowledge about subjects and what might prove damaging to them on disclosure. 
Although persons have a right to refuse to provide information, if they have 
not done so, it may be a consequence of their difficulty in predicting the 
consequences of that disclosure, particularly in the prototype situations for 
eliciting information in behavioral science inquiry. Investigators often lack 
information on how the information they seek might turn out to be harmful, 
since there is no established knowledge in the matter and they are far from 
omniscient. Moreover, whether individuals or collectivities are the object 
of inquiry, particular outcomes cannot be predicted in many instances with 
any high degree of validity and reliability. At best one often makes only 
an "informed guess." 

Finally, behavioral science research occurs in diverse settings that 
are at best characterized as "uncontrolled" research settings. Investigators 
or their agents often must enter settings over which they have little direct 
control and usually limited indirect control. Indeed, often they may enter 
a private place where others are present and the rounds of social life go 
on. As a result of being admitted to private places or as an unintended 
consequence of a research procedure, information often is acquired that was 
not intended as part of the designed inquiry. That such information could 
be potentially harmful to the person who granted consent for a particular 
study is quite obvious. That the investigator often may not have wanted to 
become privy to the matter should be equally obvious. Yet to leave unprotected 
all information that is acquired apart from the research design set forth in 
securing informed consent is to increase the risk of harm to any participant 
in a research project. Parenthetically, one might note that to leave 
unprotected the private utterances of patients being observed for post-operative 
procedure may similarly increase their risk. Unless what one becomes party 
to in a research role is protected, with few exceptions such as the commission 
of a heinous crime, informed consent should include the advice that anything 
unrelated to the research inquiry which is said or which occurs in the presence 
of the (outside) investigator can be used against them. The dilemma this 
creates for all parties to the research should be clear, but it is particularly 
critical for the informants or participants. Unable either to forecast what 
will be covered by the research design or to comprehend fully that which is 
and is not covered in a particular instance by the research mandate, 
prospective participants perhaps are best advised to refuse to participate if 
for any reason they expect that any confidential information will be secured 
that may be harmful. In any case, all parties should be aware that others who 
are not connected with the research process may decide what was not part of 
the inquiry and that all parties are unprotected in such matters. Without 
protection for confidential or private matters that are acquired apart from 
the intent of the research, then, investigators should not only make judgments 
about the likelihood that potentially damaging information might be acquired 
through their particular design or from the nature of their research settings; 
in any case they should advise parties that they are so unprotected. 

There is inevitably some risk that investigators may take undue advantage 
of any protection for all information secured from and about parties 
to a research inquiry. They may, for instance, use it for unauthorized 
inquiry. Such possibilities exist, but they seem hardly an argument for 
leaving participants unprotected, particularly when they often do not 
volunteer for research but are approached for their participation. 

For these reasons then, we consider both separately and together 
matters of consent and confidentiality and their regulation. Before doing 
so we shall consider the main model that underlies the regulation of research 
by the Department of Health, Education, and Welfare. 

The Human Subject Model. Much of the writing on regulating the acquisition, 
processing, and dissemination of knowledge is based on an elementary 
model of a principal investigator, commonly referred to as PI, acting upon 
or intervening in the life of a subject, commonly referred to as S. We shall 
speak of this as the Human Subject Model; it is the prototype in regulating 
bio-medical research. Our interest in this model here lies in the fact that 
it also underlies the Code of Federal Regulations for research grants and 
contracts of the Department of Health, Education, and Welfare that might be 
undertaken by behavioral scientists (45 CFR 46). Although understanding this 
elementary model is useful in articulating other models of inquiry, it 
oversimplifies problems and issues in informed consent and confidentiality in 
behavioral science investigations and, for that matter, in much bio-medical 
research as well. The Human Subject Model is an oversimplification for stating 
rules to protect human subjects and maintain free inquiry for a number of 
reasons. 

Much research is undertaken by a team or organization where a fairly 
large number of employees as well as investigators acquire and have access 
to information regarded as confidential. The principal investigator often 
may acquire none of the data, relying upon others to do so, and often operates 
primarily in the roles of administrator of the research and principal analyst. 
Frequently in social research, moreover, the object of the inquiry is an 
integral social group, organization, or collectivity rather than a person 
as subject. Confidential information frequently is obtained by indirect 
rather than direct inquiry or from confidential records (Goldstein, 1969: 
417-37). Consent for access to confidential information may be sought from 
administrators of records or from parties other than those who are the 
original source of information. A growing number of studies depend upon 
systematic observation of natural social phenomena where the consent of the 
observed is not regarded as problematic. Visual and audio methods of 
acquiring and storing information and computer storage and processing both 
facilitate and complicate problems of identification and access to 
information. 

Suffice it to say, then, that the roles of and parties to research do not 
conform to the elementary model of a one-to-one investigator and subject 
relationship. The prototype model for behavioral science research perhaps 
is the sample survey. In the sample survey, sampling statisticians select 
addresses of respondents, who are then approached for interview by persons 
who are not subject to immediate supervision. The work product of interviewers 
is reviewed by a supervisor, who may also make direct inquiry of the 
respondents to verify information and audit interviewer conduct. This 
information in turn is transmitted to a field office, where confidential 
information may be processed by coders and analysts before identification 
is removed. Still others will prepare the data for computation and analysis 
in a chain that ends with the preparation and dissemination of research 
reports. Some, if not all, of these specialists may need to have access to 
confidential information that identifies private parties. Few respondents 
in a survey who consent to participate by being interviewed could readily 
comprehend or become aware of this chain of accessibility to their confidence. 
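The chain of access described in this paragraph can be made concrete. The sketch below is purely illustrative: the role names follow the text, but the point at which identification is assumed to be removed is our own reading of it.

```python
# Illustrative model of the survey chain described above. Roles
# are listed in the order information flows through them; the
# cut-off point for de-identification is an assumption drawn
# from the text ("... before identification is removed").
CHAIN = [
    "sampling statistician",
    "interviewer",
    "field supervisor",
    "coder",
    "analyst",
    "report preparation",
]

LAST_IDENTIFIED = CHAIN.index("analyst")
roles_with_access = CHAIN[: LAST_IDENTIFIED + 1]
```

Even this short list, which a respondent would have to enumerate to grasp who may see identified responses, illustrates why few respondents could comprehend the chain to which they are consenting.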




That principal investigators can guarantee confidentiality under these 
circumstances is open to question. What is remarkable perhaps is how 
little evidence there is that such trust and confidence is misused or 
broken. 

Other models exist in behavioral science research for which the Human 
Subject Model is a gross oversimplification. Some of these are considered 
later, such as those for the systematic observation of behavior patterns and 
interactions, the study of organizational behavior (including organizational 
processes of regulation), and the quasi-experiment in natural social settings. 
Without explicating each of these models here, we simply ask the reader to 
bear in mind that some of the issues and problems that arise in applying 
current federal regulations to behavioral science research derive from their 
conceptualization in terms of the elementary Human Subject Model. 

II. INFORMED CONSENT 

Informed consent is said to involve "... the knowing consent of an 
individual or his legally authorized representative, so situated as to be 
able to exercise free power of choice without undue inducement or any element 
of force, fraud, deceit, duress, or other form of constraint or coercion" 
(45 CFR 46.3). 

Conditions of Consent. A strict construction of this definition would 
make it mandatory for any Institutional Review Board to decline approval of 
any proposal where there is either any "undue inducement ... or other form 
of constraint ..." or "any element (italics mine) of force, fraud, deceit, 
duress, or other form of ... coercion." From a behavioral science perspective, 
many research studies could not qualify for approval under a strict construction. 




We shall try to explain why this is so. 

Criterion of Undue Inducements. At issue in the matter of undue 
inducements is whether inducements have an effect on choice so as to make it 
"not free." To a behavioral scientist, of course, this is in itself an 
empirical question rather than a matter of "informed judgment," and it is 
well recognized that each of the terms (undue, inducement, free, and choice) 
can be operationalized in different ways for scientific investigation. Just 
when inducements become an "undue" element influencing choice would probably 
not be altogether evident even in an empirical investigation of the relationships 
between inducements and choice. Consider but one example, the question of 
whether and when subject payments for participation in an experiment or 
other scientific investigation constitute an "undue inducement." One would 
expect that members of a population would vary considerably in whether a 
given payment had a substantial effect on inducing them to participate. 
Perhaps the poorer one is, the more likely one is to opt for a given payment 
when one would otherwise have refused. The very young, the very old, and 
the unemployed may be more susceptible to any sum becoming a sufficient 
inducement to bring participation. Yet the matter is complicated and difficult 
to measure. For a good many kinds of behavioral science studies, most 
people, most of the time, will participate when there is only a simple 
request to do so and at the conclusion of the inquiry will express satisfaction 
in having done so. It will take a complex design to ferret out people's 
willingness to participate in a given kind of research without any inducement, 
whatever that might be, and their willingness to participate only under a 
given level of inducement. To substitute judgment for empirical inquiry in 
such matters seems a dubious requirement, since it will tend to lead to 
conventions about inducements that are false, with errors in both directions. 




That is, some inducements will be tabooed on the grounds that they are "undue" 
when in fact they are not, while others will be approved as not being "undue" 
when in fact they are. Moreover, under a strict construction, one is barred 
from examining the question of the effect of inducements on choice to 
participate in a given kind of scientific study, since one is prohibited 
from offering "undue inducements": they are mala prohibita if not mala in se. 
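In modern terms, the "complex design" alluded to above might randomize prospective subjects across inducement levels and compare participation rates. The sketch below is purely hypothetical; the payment levels, participation propensities, and sample size are all invented for illustration.

```python
import random

# Hypothetical sketch of a design for studying the effect of
# inducements on consent: randomly assign prospective subjects
# to payment levels and compare observed participation rates.
# The "true" propensities below are invented for illustration.
random.seed(0)  # deterministic for the example

TRUE_PROPENSITY = {0: 0.55, 5: 0.60, 50: 0.85}  # payment -> P(participate)

def simulate_arm(p, n=200):
    """Simulate n consent decisions at participation probability p."""
    return [random.random() < p for _ in range(n)]

arms = {pay: simulate_arm(p) for pay, p in TRUE_PROPENSITY.items()}
rates = {pay: sum(d) / len(d) for pay, d in arms.items()}

# A large jump in the observed rate at some payment level, relative
# to the no-payment arm, is one observable sign that the payment,
# rather than the simple request, is driving consent.
```

Such a design could not itself be approved under the strict construction discussed above, which is precisely the dilemma the text identifies.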

Money is only one class of inducements that might have an effect on 
choice. There are many other forms of inducement or reward that vary in 
the extent to which their effects are definable and measurable. Prestige, 
offers of feedback concerning skills or personality, and opportunities to 
develop skills or secure new information can be forms of inducement that at 
least for some subjects may unduly influence their choice. 

The form of inducements can be subtle and indirect, particularly when 
peer or group interests and pressures combine in a consent procedure. A 
simple example may illustrate some dilemmas and contradictions in approving 
a consent procedure. Consider the approval of an experiment on the effects 
of inducements on the rate of learning. The experiment provides for both 
individual and aggregate rewards for increments in the rate of learning. 
The consent of parents must be secured for a student to participate in the 
experiment. The school administration prefers that all students participate 
in the experiment, as does the investigator. It is more costly to contact 
the parents directly than to do so via their children, so the latter mode of 
contact for seeking consent is approved. Moreover, it is fairly well known 
that the rate of return of consent forms is affected by factors other than 
the willingness of the parent to grant consent. Thus a procedure is approved 
whereby parents are asked to sign the form only if they disapprove of 
participation (an approved modification of the written consent provision). 




Apart from questions about whether this procedure balances consent 
unduly in favor of the sponsors of the research, which it well might, other 
indirect inducements may be operating. Suppose one's peers encourage 
participation in the study; added to the proffered inducements, the study, 
for example, provides an opportunity to be free of the daily routine. 
Under the approved procedure, some students would never show the form to 
their parents or, if the consent of both parents is not required, would 
select the parent who is most amenable to their persuasion. Moreover, most 
parents may well sign without a careful reading; they succumb in the moment 
to the request for a signature: "you've got to sign this, so I can take 
this test." 

Indeed, only empirical inquiry can shed light on how parental consent 
procedures work. We know very little about them, if for no other reason than 
that the behavioral science community, like any community, may opt for 
"functional ignorance." Unless approval is forthcoming for studying the effect 
of inducements on consent, and unless review boards are vigilant in searching 
for indirect as well as direct forms of inducement, considerable "error" 
will attend decisions about inducements. 

Criterion of Coercion: Force, Fraud, Deceit, Duress. Institutional 
Review Boards are required under the strict construction to disapprove any 
research project where there is any element of coercion by force, fraud, 
deceit, duress, or other means. Except for research designs that require 
deception in soliciting consent, the direct use of force, fraud, or duress 
by investigators in soliciting consent is uncommon. There are, however, 
somewhat more studies in which force or duress is an element that may affect 
the continuing grant of consent during the inquiry by applying pressure 
against withdrawal. Again, such forms of force and duress are less likely to 
be direct manipulations by investigators than consequences of the procedure 
or of the very phenomenon that is under investigation. The Milgram ( ) and 
Zimbardo ( ) experiments are but obvious examples of such elements 
operating before and during the inquiry, because the elements of coercion 
and duress were themselves objects of investigation. 

Yet it is the less obvious sources that pose difficulty for Institutional 
Review Boards in behavioral science inquiry, particularly if one is 
careful to ensure that there is freedom not only to enter the research 
relationship but also to refuse to respond to specific inquiries for information 
and to terminate the relationship altogether at any time. These may very 
well be stages in processes of social engagement and disengagement. Once a 
person is committed by the initial consent procedure, fiducial relationships 
are not easily broken. A person may prefer deliberate deceit in reply over a 
refusal to answer or a termination of consent (an interesting moral dilemma), 
or one may give truthful answers that would not be given were it not for the 
"threshold problems" in breaking a fiducial relationship. 

It is reasonably well established that groups have considerable 
power over their membership by legitimating forms of coercion or duress. 
These very elements may be incorporated as features in a study design, either 
procedurally or as objects of inquiry. A few examples may illustrate the kinds 
of decision problems that might arise for Institutional Review Boards: 

(1) Using Group Techniques. Many forms of group therapy or change 
depend upon group processes in which force and duress are elements. 
Such techniques are also used simply to acquire information on 
group processes. Coercive pressures from the group to continue in the face 
of any member's wish to withdraw are particularly common. They are more 
evident, for example, in the use of Tavistock than of NTL group techniques, 
but often arise in group settings as a consequence of the procedural mode of 
inquiry or the study design. Even where the procedure is described in 
advance and consent is given, the experience under group pressure may have 
a substantial effect on the choice to withdraw. 

(2) Using Contract to Secure Information. While the elements of 
contract may be present in many consent procedures, e.g., paying subjects 
to participate or offering some other benefit directly to the participant, 
under certain circumstances formal contract is an element in behavioral 
science research. This is not uncommonly the case in evaluation research 
on government programs, where federal legislation and policy make funding 
contingent upon agreement to outside evaluation. Under these conditions the 
choice to withdraw is constrained by formal contract, and indeed the cost of 
withdrawing may be coercive in continuing participation. Employees of programs 
being evaluated may similarly contract for participation in the evaluation 
as a condition of employment in the program. Where formal contract governs 
requirements for participation, the element of free choice to participate 
and to withdraw may inevitably be compromised. The need to evaluate federal 
programs, given their costs and consequences, may be deemed compelling in the 
resort to contract. Perhaps some guidelines on the use of formal contract 
by organizations and their agents are necessary to guide the discretionary 
choices of Institutional Review Boards. 

(3) Using Organizational Sponsorship and Participation. Parties 
providing information on private matters or in their organizational roles 
(as officials, clients, agents, or employees) must be given to understand 
that their failure to participate in no way jeopardizes them or their 
affiliative relationship. The condition is not easily satisfied. Is there 
no element of coercion when students in courses are asked to participate in 
research? When anyone superior in a hierarchy of authority asks an inferior 
to participate? When an organizational decision or formal agreement to 
participate precedes the request for consent from individual participants? 
Whenever an organization stands to benefit from feedback generated in a 
behavioral science inquiry, it has an incentive to agree to and encourage 
participation from its members. When is encouragement not coercion? 
Equally important to the understanding of the effects of organizational power 
on member participation is the question of whether there are ways of 
eliminating all effects of organizational power. It seems possible to 
reduce such effects when present, but their elimination, as the strict 
construction implies, seems doubtful. Thus, while I know of ways that I 
can minimize the effects of teacher power over students and still have them 
participate in teacher-sponsored research, in each case there is still the 
possibility that residual elements of coercion exist. 

Apart from the direct effect of organizational sponsorship and participation 
on the consent of members, there may be indirect effects of organizational 
power in the form of legitimated authority or power. It has often 
been observed that surveys under government auspices and administration have 
response rates well above those of private organizations. There is a 
common belief that this is partly owing to the incremental effect of government 
authority as a prestigeful and legitimate source obligating compliance, 
or to the effect of coercive anxiety that failure to comply might jeopardize 
other relations with or benefits from government agencies. Yet it is difficult 
to disentangle any such effects from one another and from other possible 
effects, such as differences in the training of survey interviewers, in 
organizational resources, and so on. Whether Institutional Review Boards 
should approve all forms of legitimating auspices that may have coercive or 
inducement effects is problematic; even a Board's own institution may have 
legitimating properties that affect participation. 

(4) Using Particular Methods for Eliciting Information. Methods for 
eliciting information must be free of all elements of coercion and any undue 
inducement if there is to be free choice in providing information. 
There is considerable variation in techniques for eliciting information 
and in the conditions for complying with the task of providing information. 
These techniques must vary considerably in their coercive features; little is 
known about this variation from past research. One wonders, for example, 
whether there would be differences in responding to a typical survey question 
on private matters if options were routinely given to respond that it is 
a private matter. Again, it should be noted that we know all too little 
about how much falsification there is in responses to interview or test 
questions about private matters, because respondents feel too embarrassed or 
constrained to say that it is "nobody's business" or, because the information 
is requested, to say that they wish to withdraw their consent to continue 
in the survey. 

There are other and perhaps more subtle ways that procedures for 
eliciting information coerce or constrain responses. Interviewers are 
trained to induce "cooperation," develop "rapport," or lead into a sensitive 
area of privacy. Instruments are designed to subtly lead up to the eliciting 
of such information; ways of indirectly measuring such responses are not 
uncommon. Thus one would usually not ask respondents whether they are 
prejudiced towards members of a particular minority or have discriminated 
against them in the past; more indirect methods would be used. 

Inducement and subtle forms of coercion are not necessarily evident 
even to investigators and only a careful examination of how respondents 
perceive or interpret the procedure and other elements of the inquiry may 

25-21 



disclose them. Coercive techniques and inducements to cooperate in an 
inquiry, moreover, are not equally operative for all members of a study 
population. There is some evidence that the less educated and underclasses 
are more likely to be induced into consent out of ignorance or misunder- 
standing as to what they are free to do than are others. In general, the 
more the power between the investigator and the sources of information is 
balanced in favor of the sources of information, the less certain any 
investigator is to gain information on private matters. It would seem, for 
example, that it is easier to acquire information on theft and fraud 
from low than from high income respondents. For that reason, a study of shop- 
lifting may be more successfully completed with consent from respondents 
than, let us say, a study of income tax evasion. One might ponder whether 
many study designs should not be approved until the matter of the effect 
of inducements on consent is itself investigated, but that of course 
entails a relaxation of the strict construction. 

The fraudulent use of trust is protected against at law, but the more common 
forms of deception that are practiced in social research may lie outside 
legal protection. While the matter of deception is explored in a separate 
paper for the Commission, a few additional observations are offered here 
since they relate to the matter of explaining procedures and measurement 
that must be communicated in obtaining informed consent. We shall set 
aside for these purposes the question of whether they cause individual harm 
and take those instances where the potential for harm is largely absent or 
minimal, particularly if there are legal protections for confidentiality 
and the possibility of social benefits is reasonably substantial. 

It is a commonplace in behavioral science research that persons are 
likely to give "expected" or "socially desirable" responses to questions 
rather than their true response. There is, moreover, a strong tendency 
for respondents to cloak socially undesirable responses or behavior or at 
least not to disclose them to persons who are not known to them. Both of 
these events pose problems for social measurement so that procedures are 
designed to measure without subject awareness of the intent of the measure. 
Thus there are techniques for determining whether a given respondent is 
falsifying responses and ways of measuring socially undesirable attitudes or 
conduct indirectly. Turning again to the study of prejudice and discrimina- 
tion, it should be evident that both prejudice and discrimination are more 
likely to be measured indirectly rather than directly and certainly not 
directly if the identity of a subject is to be known to the inquirer. 
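The indirect-measurement strategy sketched above can be made concrete with Warner's randomized response technique, a well-established survey method for sensitive attributes (the paper does not name it; the parameters and code below are illustrative assumptions only). Each respondent privately randomizes which of two complementary questions to answer, so the interviewer never learns any individual's true status, yet the aggregate rate remains estimable:

```python
import random

def simulate_randomized_response(true_rate, p=0.7, n=100_000, seed=42):
    """Simulate Warner's randomized-response survey.

    With probability p a respondent answers the sensitive question
    truthfully; otherwise he answers its complement.  The interviewer
    records only "yes"/"no" and never learns which question was answered.
    """
    rng = random.Random(seed)
    yes = 0
    for _ in range(n):
        has_attribute = rng.random() < true_rate   # respondent's secret
        answers_direct = rng.random() < p          # private randomization
        yes += has_attribute if answers_direct else not has_attribute
    return yes / n  # observed proportion of "yes" answers

def estimate_true_rate(observed_yes_rate, p=0.7):
    # lambda = p*pi + (1 - p)*(1 - pi)  =>  pi = (lambda - (1 - p)) / (2p - 1)
    return (observed_yes_rate - (1 - p)) / (2 * p - 1)

lam = simulate_randomized_response(true_rate=0.20)
print(round(estimate_true_rate(lam), 3))  # an estimate close to the true 0.20
```

No record links any individual to a truthful answer, which is precisely the property that mitigates the embarrassment and falsification problems noted above.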

The problem is further complicated by quasi-experiments in natural 
social settings, particularly public settings where the observation of 
behavior may be recorded. A great deal was learned in just such experiments 
about discrimination in employment, housing, and public accommodations or 
facilities. Such experiments not uncommonly are conducted using members 
of minority and majority groups as paid participants in the experiment and 
observing the responses that others make to their behavior. Through the 
Civil Rights Movement and subsequent legitimation in legislation, such 
techniques can be practiced by operating organizations as a means of 
gathering intelligence to enforce civil rights laws. Whether they should be 
precluded in research because they involve an element of deception is moot. 
Indeed, as we shall note later, the explicit obligation to disclose any 
procedures that are experimental (45 CFR 46.3:c-1) when coupled with a 
prohibition against deception could seriously jeopardize the status of 
social experiments in social problems research. 

Who Must Consent? 

The HEW Code of Federal Regulations appears to stipulate that informed 
consent need be secured only for research in which "subjects are at risk." 
A "Subject at risk means any individual who may be exposed to the possibility 
of injury, including physical, psychological, or social injury, as a conse- 
quence of participation as a subject in any research, development, or related 
activity which departs from the application of those established and accepted 
methods necessary to meet his needs, or which increases the ordinary risks 
of daily life, including the recognized risks inherent in a chosen occupa- 
tion or field of service." (45 CFR 46.3:b). The operative provision for much 
behavioral science research is the one relating to research that increases 
the ordinary risks of daily life. Since behavioral scientists ordinarily 
deal in information processing rather than in manipulation of human subjects 
and social groups, the risk of disclosure of elicited information presumably 
increases the ordinary risks of daily life. Under that interpretation most 
procedures that elicit information require informed consent. 

The question arises, however, whether exemption can be granted whenever 
a person is not exposed to risk as a consequence of "participation as a 
subject in any research, development, or related activity." Put another way, 
when is a person not a participant? This is not a simple matter in certain 
kinds of research. A reasonable argument may be made that systematic 
observation of natural social phenomena in public settings or public places 
should be exempted from the requirement to secure informed consent. Apart 
from procedural difficulties in securing that consent — a matter considered 
below — when persons are simply observed without any other intervention by an 
investigator, they are hardly participants in a research project. One might 
think by way of analogy to the social role of newsmen and the standards 
applied as to whether consent must be obtained by a newsman to report on 
public events or to record them in various ways including by video-tape. 
One might consider also by way of illustration the research sponsored by 
the President's Commission on Campus Unrest into the events at 
Kent State University. The Commission research staff utilized a large number 
of video-tapes, photographs, and observer accounts, including those of news- 
men, to reconstruct the tragic events at Kent State. No effort was made to 
secure consent from the participants in those events even though many could 
be uniquely identified. 

Frequently in social science research, a participant is a member of an 
organization whose behavior is examined — the object of the inquiry is the 
behavior of organizations or collectivities. While information might be 
obtained from many persons within the organization, there is reason to 
question whether their informed consent is required if consent has been 
given by organizational representatives and the information elicited pertains 
to their role within the organization. It is even possible that harm might 
result from the inquiry, e.g., that the number of positions in the organiza- 
tion might be reduced and some people lose their jobs; yet it is not the 
subject who is the participant but the organization and positions within it. 
The organization, for example, may in fact require assessment of job performance 
as a condition of employment and thereby obviate a specific requirement for 
informed consent. There are very special problems when an organization re- 
quires a given procedure to be followed that is both a method of organiza- 
tional intelligence and assessment and an input into evaluation research. 
Whether employees should be permitted to decline participation in the research 
project if the officials authorized to consent for the organization grant 
consent for the research is problematic. 

One might view this problem in another way. Where organizational 
consent is required to undertake an inquiry, that consent is essential. 
Whether or not that consent should be informed is unclear in some federal 
regulations, but clear in others. The proposed LEAA regulations, for 
example, require organizational consent (28 CFR 22.2). A person is defined 
as "any individual, partnership, corporation, association, public or 
private organization or governmental entity, or combination thereof" and 
a private person means "any person as defined . . . other than an agency, 
or department of Federal, State, or local government, or any component 
or combination thereof." Under this proposed definition all 'individuals' 
are 'private persons' (e.g., no distinction is made between an 'individual' 
acting in a 'private' as opposed to 'official' capacity) (28 CFR 22.2: 
Commentary). The proposed regulations require that the elements of notifica- 
tion be followed for all persons. They do however provide for an exception 
when information is to be obtained by observation or when "... such dis- 
closure would have a serious detrimental effect on subject participation or 
purpose of research that would make conduct of the program impossible" 
(28 CFR 22.27:c). 

There are several criteria that can be considered in determining whether 
the requirement to obtain informed consent may be waived or unnecessary. 
Each of them is briefly considered: 

1. The consent of persons need not be obtained when what is observed 
is ordinarily open to observation by many others in the course of daily 
life — i.e., it is public knowledge. The exemption could extend to private 
places open to the public as well as to public places. As a corollary, 
consent need not be obtained for the observation of what is public behavior 
in public places. The question of whether private behavior in public places 
is similarly exempt from the requirement of informed consent is more difficult 
to defend, e.g., making a record of an overheard conversation in a public 
place or during a public event with evidence of the identifying characteris- 
tics of those engaged in conversation. We shall later consider separately 
the matter of unique identifiers and the special conditions of consent 
related to them. 

2. Within hierarchical organizations, the necessity to secure informed 
consent may be restricted to the highest level of participant representing 
the organization provided that the object of the inquiry is organizational 
behavior or aggregation across an organization or organizations rather than 
the persons who are members of that organization. 

3. Special problems arise as to whether organizational consent is 
required when the object of inquiry is an organization but the information 
on the organization is secured solely by obtaining the informed consent of 
members of that organization. Should one, for example, require the consent 
of teachers to test the learning increment of students in their classes, or 
only that of the students, when teacher as well as student performance is 
being evaluated? There would appear to be no simple answer to that ques- 
tion, but it must be borne in mind that when there is substantial power to 
block the objectives of inquiry due solely to the power persons are given 
by virtue of their position, then their consent need not always be required, 
provided they are not coerced into participation. Put another way, informa- 
tion can be sought about individuals and organizations that is not strictly 
a personal matter, i.e., it pertains to their organizational or public roles, 
when their power to withhold consent blocks the objectives of an inquiry to 
which others grant their informed consent. 

This is but a special case of a more general problem of using social 
power to block the objectives of legitimate scientific inquiry when those 
in lesser positions of power grant their consent. Thus when a police 
chief refuses to grant permission for interviewing police officers in the 
police department regarding police work but the officers consent to being 
interviewed about these matters when off duty, the consent of the police 
chief need not be required. A refusal from persons in positions of social 
power to grant consent should not ordinarily preclude obtaining the same 
information from others who grant informed consent. 

4. When consent is obtained to investigate social relationships or 
social settings from at least one of the participants to an event, the 
consent of all participants need not be obtained if that requirement would 
be burdensome and it is unlikely that any undue risk is occasioned by 
their failure to do so. There may be difficulties in determining when it 
is not necessary to obtain consent. Consider the following example. 
Suppose one wants to study the way teachers allocate time to various roles 
in their classroom because one is interested in how much time is spent in 
teaching and how much time in the role of principal disciplinarian. The 
observer will only sit and observe, never intervening in the process. The 
purpose of the study is fully communicated to the Board of Education which 
grants its consent. School principals are made aware that consent has 
been granted and are requested by the Board to participate in the study. 
The principals in turn inform teachers that an observer will be present in 
their classroom and has the Board's permission to be present and observe. 
Teachers in turn may introduce the observer to pupils only as someone doing 
research. Is it sufficient in this example that informed consent be ob- 
tained by the research investigator only from the Board of Education provided 
that all individual identities are protected? The problem, as one can see, 
is very much tied to the question of confidentiality. Where confidentiality 
can be protected so that no one within the organization is privy to any 
information that uniquely identifies persons within the organization, only 
organizational consent may be required if there is legal protection against 
disclosure. 

One reason why such a rule may be reasonable is that the procedure 
itself entails no risk to those who participate as 
data sources in the inquiry — in brief, they are doing nothing they might 
not otherwise do and in fact are free to alter their behavior in the 
presence of an observer if they so wish. If there is any risk of harm in 
such situations it arises from the disclosure of information, once ac- 
quired, or from the knowledge that is applied following the research. Now 
if there are formal contractual agreements with organizations guaranteeing 
the protection of the identifying information from all, including members 
of the organization, and there is legal protection against compulsory and 
unauthorized disclosure, the need for informed consent seems altogether 
obviated if generalizations apply to aggregates rather than individuals. 

There are difficult cases nevertheless. Consider, for example, a 
study of police behavior in police and citizen encounters where one has 
secured the consent of the police to observe their behavior. Clearly one 
cannot observe the behavior of the police without observing the behavior of 
citizens in the encounter — a common problem in studying behavior in human 
interactions. A requirement that the consent of the 
citizen be secured before one could study the behavior of the police in 
the interaction is not only burdensome but might well endanger both the 
police and citizens under some circumstances were it necessary to secure 
the consent of the citizen before the police could intervene. Indeed, the 
most likely result would be to foreclose that kind of research altogether 
since the police could hardly be expected to agree to allow an observer 
to observe their behavior on the condition that the observer first be 
allowed to secure the consent of the citizen before any police behavior could 
take place. This example clearly points up a complication of studying 
behavior in natural social settings where the intervention to secure an 
informed consent can itself fundamentally alter social situations and the 
risks attached to them! We shall have occasion to note later that the 
protection of confidentiality and strong sanctions for violation is critical 
in considering the matter of informed consent. In much behavioral social 
science research the only risk that exists is the risk arising from the 
failure of the society to grant legal protection for information. Thus 
in many cases the question should shift to the question of when legal protec- 
tion should be given, as by a confidentiality certificate, rather than 
whether there should be informed consent. Informed consent is crucial when 
something can happen to the person because of what the procedure of inquiry 
does directly to the participant; it seems far less critical when the only 
harm that can occur arises from the disclosure of private information — a 
problem that is largely obviated by legal protection. 

Who May Grant Consent For Intrusion Into Private Matters? 

Private matters may be those of individuals or corporate bodies. 
Individuals generally have information about their own private matters, those 
of others, and those of corporate bodies. A corporate body, similarly, 
possesses information about the private matters of individuals and its own 
affairs. Clearly at issue is what each may consent to or provide informa- 
tion about without having secured the consent of others on whom they give 
information. Correlatively, may an investigator obtain information when, in 
securing it, information is often obtained that pertains to the private 
affairs of others? The principle that competent individuals have the right 
to consent to intrusion upon their private affairs poses questions of com- 
petence and the form of inquiry. Provided that an individual is competent, 
consent may be given on direct inquiry. The question of age of competence 
to grant consent can rest in a legal age of adult status, but whether social 
investigators should abide by that definition of age of consent is debatable. 
The criteria for establishing mental or emotional competence to grant 
consent are far more ambiguous. Does the consent of a mental patient to 
direct inquiry, for example, automatically satisfy criteria for protecting 
subjects? Absent competence to grant consent, is the criterion of consent 
granted by the person or persons "responsible" for the incompetent adequate? 

The question of who may grant consent is particularly troublesome when 
information about private affairs is secured by indirect inquiry (from 
others) or from the records of corporate bodies. Even where a corporate 
body has secured consent to disclose information for use by others, as 
Goldstein notes, the agreement is generally so vague or incomplete as to 
lack the basic elements of informed consent (Goldstein, 1969). A simple 
agreement that the information will be used only for research or later 
treatment, for example, lacks the basic elements of informed consent. The 
absence of specific legal prohibitions against divulging information that 
identifies individuals leads to much questionable use of files and dossiers 
of corporate bodies. 

One of the more difficult questions about consent for access to informa- 
tion on private matters arises in securing consent on the private matters of 
corporate bodies since the organization often has no clear procedures for 
granting consent to gain access to such information. Employees, moreover, 
may purport to give consent when they lack authority to do so or they make 
disclosures inadvertently. Without written authorization for access to 
specific information on corporate bodies, the legitimacy of acquiring such 
information is highly questionable. 

An important question for Institutional Review Boards to consider is 
whether they can maintain a viable behavioral science while adhering to 
the following principles for who may grant consent: 

1. Information on the private affairs of individuals shall be obtained 
only by informed consent on direct inquiry from the individual on his or 
her private affairs. 

2. There shall be no indirect inquiry on the private affairs of 
others, or access to such information from corporate bodies when the individual 
can be identified by the investigators. 

3. Information on the private affairs of corporate bodies that identifies 
the body may be obtained only on written authorization of an individual or 
group of that body that has authority to grant that consent. 

One need not reflect long to see that such principles seem to fly in 
the face of social reality. Many personal or seemingly private matters 
arise in interactions that involve the private affairs of all parties in 
the interaction. Questions of the husband about the marriage relationship 
usually disclose private affairs of the wife. Questions asked of children 
about their relationships with parents frequently pry into the private 
affairs of their parents. In general, inquiries into private matters are often 
inquiries into personal matters, since they probe what sociologists call 
interpersonal relationships. The same holds true for the relationships of corporate 
bodies with their clients and with other bodies. Furthermore, on direct 
inquiry few persons or agents of organizations separate their personal 
view about others from disclosing facts about others. A research procedure, 
indeed, can capitalize on the fact that informants do not make such separa- 
tions. The willingness of persons to disclose information about others 
often is used to reduce the cost of collecting information. In any case, 
what these and many other examples can illustrate is that in the course of 
social inquiry, one simply cannot avoid acquiring information that could 
bring harm to others whose consent was not obtained. Psychiatrists are 
altogether familiar with this problem in treating patients; social scientists 
are altogether familiar with it in studying most aspects of social life. 
Clearly what this points to again is that the problem arises as to how to 
protect information from disclosure when the only alternative is to foreclose 
the possibility of the inquiry altogether. Considered in yet another way, 
to what can an individual consent without risking disclosures that depend 
upon the consent of others? For whole classes of problematic aspects of 
social life that involve the study of relationships or interrelationships, 
and for certain kinds of techniques such as sociometric and social network 
analysis that are based on social exchanges, it is impossible 
to proceed without acquiring information on more than a single party 
whose consent was obtained. There is no simple answer to that question. The 
suggestion that the consent of all parties be obtained before that of a 
single party is obtained is often unworkable since in many cases the other 
parties are not known in advance. Thus one cannot study friendship networks 
except by first discovering the friendship network. Suppose that some friend- 
ship networks include participants in a form of deviant behavior, e.g., 
homosexual conduct. If one began by delineating the network and followed 
this with queries to learn what it is that formed the basis of friendship 
only to learn then that it is a form of sex relationship, one is immediately 
privy to information on all parties to the network. What seems required for 
the study of "private" social relationships and exchanges, then, is adequate 
protection against disclosure of the information secured by informed consent. 
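The friendship-network difficulty just described can be seen in miniature in a snowball-sampling sketch. The helper below is hypothetical (the `name_friends` callable stands in for the interview step); the point it illustrates is that every name a consenting respondent supplies is information about someone whose own consent has not yet been obtained.

```python
from collections import deque

def snowball_sample(seed_person, name_friends, max_depth=2):
    """Trace a friendship network outward from one consenting seed.

    `name_friends(person)` stands in for the interview step: it returns
    the friends that person names.  Returns the set of persons the
    investigator now holds information about, plus the named ties.
    """
    known = {seed_person}
    ties = set()
    frontier = deque([(seed_person, 0)])
    while frontier:
        person, depth = frontier.popleft()
        if depth >= max_depth:
            continue
        for friend in name_friends(person):
            ties.add((person, friend))  # disclosed by `person`, not by `friend`
            if friend not in known:
                known.add(friend)
                frontier.append((friend, depth + 1))
    return known, ties

# Toy illustration: only "A" has consented, yet B, C, and D become known.
directory = {"A": ["B", "C"], "B": ["A", "D"], "C": [], "D": ["E"]}
people, ties = snowball_sample("A", lambda p: directory.get(p, []))
print(sorted(people))  # information is now held on A, B, C, and D
```

Whatever later identifies the network as organized around a stigmatized relationship thereby attaches to every person in `known`, not merely to the seed who consented.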

The consent process is further complicated by the question of who 
grants consent given the ways that persons become accessible for behavioral 
science investigations. The bio-medical model quite commonly assumes that 
the research subject becomes accessible for reasons other than the particular 
research inquiry. Moreover, they become accessible to investigation in 
settings that are controlled by persons who conduct the investigation. 
Typically the bio-medical model refers to clients or patients who are re- 
questing treatment of professionals who operate in offices, clinics, or 
hospitals that are subject in some measure to the investigator's control. 
Certainly all of these settings lie beyond the control of the research 
subject. The fact that many subjects become accessible for research because 
they are at the same time in some other role relationship with the investigators, 
such as patient and therapist, is also critical. Even when they become acces- 
sible because they are the clients of other professionals, it is well to 
bear in mind that the accessibility of research subjects depends upon an 
institutionally organized setting and a confraternity. Where there are 
overlapping dependencies in role relationships, such as the doctor-patient 
with the principal investigator-subject relationship, and where the subject is 
on alien and unfamiliar territory that lies 
beyond his domain and control, and where matters are further complicated by an 
active procedural intervention on the subject, it seems essential that the 
research subject be able to distinguish these separate roles and what is 

open to choice. 

There is a second model, that of subject research in total institutions, 
where it seems that the right of subjects to informed consent is critical. 
A closed institutional setting lies beyond the capacity of subjects to 
control. Much of their activity, moreover, is constrained and coerced by 
total institutional routines (Goffman, 1961). Special attention must 
be given to insure that their participation is voluntary, not only by 
securing informed consent directly from subjects, but by insuring that the 
prior processes of securing institutional consent have had no effect upon 
subject consent. Both bio-medical and behavioral science research occur 
within total institutional settings, and special consent and confidentiality 
procedures are appropriately established for such settings. 

A third model is the one already alluded to where subjects become 
available because they are members of organizations that make them accessible 
to inquiry. Here the problems exist of securing consent from multiple 
parties, a matter considered previously. Much of the matter of consent, as 
already noted, depends upon whether it is the behavior of organizations or 
the behavior of persons that is under investigation. 

A fourth model is prototypical in social surveys and some systematic 
social observation surveys of natural social phenomena or occurrences. 
Typically, the social investigator or observer moves to the setting of the 
participants and must accommodate to their rules. The setting is almost 
entirely beyond the control of the investigator and largely subject to 
control by the behavior of the participants, particularly in private places. 
The investigator is there as a matter of privilege, if it is a private 
place. Moreover, typically there is no prior relationship and none is ex- 
pected following the completion of the research task. Contact and communica- 
tion is therefore limited exclusively to the research investigator-respondent 
relationship. It would appear that the social power of those inquiring and 
of those of whom inquiry is made is more nearly equalized under these condi- 
tions. 

There is a fifth model, where if consent is required, an abbreviated 
form may be all that is necessary. This is typically represented by the 
phone or mail survey, assuming no method of data collection that provides 
unique identification is employed. With the phone survey, the investigator 
is quite limited in both verifying information and controlling the situation. 
Lacking any prior role relationship, having only a short time and tenuous 
grounds for establishing one, and lacking most criteria for establishing 
identity, an abbreviated form of consent often is not only necessary if 
the survey is to proceed but also in keeping with the balance of social 
power in the subject-researcher relationship. There perhaps is no condition 
under which it is easier for a subject to refuse access, refuse to respond, 
and withdraw from participation than by hanging up the telephone. 

Generalizing across these models, one might conclude that Institutional 
Review Boards should pay particular attention to: (1) whether the investigator- 
subject relationship grows out of a prior or continuing relationship; (2) the 
balance of power between subject and investigator ranging from subject 
dependent to investigator dependent; (3) whether the research setting is 
subject to control by either of the parties to the inquiry or by other parties 
who may create an imbalance in investigator-subject power; and (4) whether 
the procedure of investigation alters the condition of the subject. Rules 
of the following sort might guide decisions. Where subject power is low 
relative to investigator power, there should be considerably more attention 
to the effects of these conditions upon subjects' free choice. Correlatively, where 
investigator power is low relative to subject power, the requirement of 
informed consent may be waived or abbreviated forms accepted. 
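The four factors above can be caricatured in a short decision aid. The weights and thresholds below are purely illustrative assumptions, proposed neither by this paper nor by any regulation; the sketch only shows how an IRB checklist encoding these factors might be organized.

```python
def consent_scrutiny(prior_relationship, subject_power_low,
                     investigator_controls_setting, procedure_alters_subject):
    """Illustrative scoring of the four factors; all weights are assumptions."""
    score = 0
    if prior_relationship:             # overlapping roles, e.g. doctor/investigator
        score += 1
    if subject_power_low:              # power imbalance favors the investigator
        score += 2
    if investigator_controls_setting:  # subject on alien territory
        score += 1
    if procedure_alters_subject:       # active intervention, not mere observation
        score += 2
    if score >= 4:
        return "full informed consent with heightened review"
    if score >= 2:
        return "standard informed consent"
    return "abbreviated consent may suffice"

# A phone survey: no prior relationship, subject can simply hang up.
print(consent_scrutiny(False, False, False, False))
# -> abbreviated consent may suffice
```

The point is not the particular numbers but that the factors are jointly, not singly, decisive: a captive subject undergoing an active procedure scores high on this sketch even absent any prior relationship.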

Who May Secure Consent. Formal rules for certifying human subject 
research typically do not confront the question of who is qualified to 
secure consent. Generally the qualifications of the principal investigators 
are taken as the criteria for approving the solicitation of consent. A 
similar situation tends to prevail for Institutional Review Boards where the 
reputation of field staffs, survey organizations, and other specialists in 
eliciting information is taken as the criterion for approving the elicita- 
tion of consent. Typically the social organization of research and its growth 
in scale has led to the training and development of specialists in eliciting 
information — survey interviewers, for example, or trained social observers. 
While matters of prestige and reputation are guides, they are far from 
infallible in insuring that required procedures will be followed. 

It is no simple matter to control the activities of persons whose task 
it is to elicit information. Generally in social research there are part- 
time as well as full-time white-collar employees, who have been trained in 
a particular eliciting procedure. Often student volunteers or assistants 
in training are members of the research team. When there are professional 
specialists, such as clinicians, procedures for certification exist; despite 
failings, certification provides reasonable grounds for deciding competence 
and trust in a fiduciary capacity. 

Yet it remains true that much social research is conducted by a spatially 
dispersed set of employees who are not subject to direct supervision and 
often are not under the direct control of the principal investigator. Their 
competence will vary considerably. This makes the fiduciary relationship 
between investigator and task specialist and of the latter with the subject 
precarious in two ways. The subject is vulnerable to incompetence and 
unauthorized misuse of information as well as fraud in failing to secure 
informed consent. The principal investigator is vulnerable to the employee's 
misuse of procedure and information, which increases his legal 
liability and compromises the integrity of the research process. Lacking legal protec- 
tion from employee misuse of position and information and strong sanctions 
against violation of the fiduciary relationship, it is difficult to guarantee 
and control confidentiality. Procedural competence can at least be partially 
controlled if the principal investigator either monitors or seeks ways of 
determining employee competence to undertake the task of eliciting informa- 
tion. Yet in all employing organizations there are failures, and a research 
organization is no more invulnerable to such failures than is any other 
organization. It is in fact quite remarkable that misuse of confidentiality 
and poor practice generally have not crossed the threshold to be regarded 
as problematic in behavioral science inquiry. At the same time it must be 
said that very little attention has been given to these matters. Where 
confidentiality is essential to the design of an investigation, principal 
investigators and Institutional Review Boards should seek information on 
the competence of those who elicit information. 

The corporate nature of research and the size and scope of the inquiry 
enlarge the circle of persons who elicit information. It is essential 
therefore that attention be given to what forms of organizational control 
are exerted over employees, what sanctions are available for misuse of 
authority, and what procedures are followed to insure that they are properly 
trained in the particular eliciting procedure. Institutional Review Boards 
may require assurances that such training procedures are actually carried 
out. It is one among a number of matters that should be called for in 
routine monitoring of behavioral research. 

In behavioral science research procedures, generally those who elicit 
consent are those who terminate the research relationship. It is important 
that they not only be sensitive to the right of subjects to terminate the 
relationship at any time but that they fully inform them of any changes in 
conditions related to the consent. The regulations governing confidentiality 
certificates obligate investigators to inform subjects when a certificate 
is terminated (42 CFR 2a.4:8 and 2a.8). Institutional Review Boards should 
request assurance and evidence that subjects are advised appropriately when 
a confidentiality certificate is withdrawn. 

Elements of Notice for Informed Consent 

The Code of Federal Regulations for HEW sponsored research (45 CFR 46) 

sets forth a number of basic elements of information for which notice must 

be given in eliciting an informed consent. Each of these is considered in 

terms of its special implications for behavioral science inquiry. 

(1) A fair explanation of the procedures to be followed and their 
purposes, including identification of any procedures which are 
experimental. (45 CFR 46.3:c-1) 

Although it seems appropriate in securing informed consent to explain 

the procedures that are to be followed in eliciting information from persons, 

it is generally correct to say that almost all of the procedures for eliciting 

information have little effect on persons or organizations qua procedures. 

Thus the procedure of interviewing that consists of asking questions and 

getting answers has little if any effect on persons; indeed the elements 

of the procedure occur in everyday life. Where the procedure has some effect 

on subjects because of special experimental interventions or stimuli or for 

other reasons, some explanation seems required. But routine eliciting 

procedures would appear to require little by way of explanation. 

It is unclear whether, in requiring that the purpose of any 
procedure be described, more is intended than its procedural use. 
If the purpose is to require some explanation as to the kind of informa- 
tion that is to be elicited or task to be performed, matters of communicating 
the substance of the inquiry and the goals of the investigation need to be 
specified. Generally in behavioral science research, any detailed explana- 
tion or description of these matters would prove burdensome and might have 
a substantial effect on the rate of consent. Perhaps the best rule to 
follow is that subjects should be advised on matters of substance if the 
procedure will elicit information on confidential matters, matters that 
are ordinarily anxiety provoking, or ones that a minority of respondents 
find objectionable. For many social science investigations, however, a simple 
statement of what procedure is to be used — a poll, a survey, an interview, 
watching or observing, filling out a questionnaire, completing a form — 
will be sufficient. 

Earlier we made note of the fact that some social experiments, surveys, 
and evaluation studies require a cloaking of purposes or measures if they are 
to provide valid and reliable information. Whether persons must be advised 
that there are some indirect measures in the study about which feedback will 
be given at the close of the procedure or whether other modes of communicating 
must be followed is moot. Where no particular harm will befall a person as 
a consequence of using deceptive or indirect measures, it would seem un- 
necessary to require that it be communicated in securing informed consent. 
The full implication of this position bears scrutiny, however. 

It should be clear that social scientists not infrequently seek to 
acquire information that persons would not provide if directly and explicitly 
informed of the intent by the investigator. To tell a parent that one is 
interested in learning whether they are authoritarian or democratic, punitive 
or permissive, racist, liberal or conservative, or sexist or egalitarian in 
their child-rearing practices is not only unwise if one is interested in 
valid and reliable measures; it also risks failing to secure consent for 
studies that may have enormous social benefits. 

What seems critical in informing persons or organizations about the 
procedure to be followed is that they be informed about the procedures for 
analyzing and reporting upon the information that is to be gathered. 
Generally, social scientists are interested in analyzing and reporting data 
for large aggregates in which it is not possible to identify individuals. 
It should be sufficient in many instances to simply inform the person whose 
consent is being sought that one is doing a statistical study where it will 
not be possible to identify them with any of the information that becomes 
public knowledge. Analyzing and reporting information for social aggregates 
or collectivities is an important way of preventing disclosure of uniquely 
identifiable information. When, for any reason, a procedure of analysis or 
reporting data is to be followed where it may be possible to make inferences 
about individual identities, persons should be apprised that this is the procedure 
to be followed. A statement, for instance, that the information is to be 
presented as a case study and whether or how identity is to be cloaked in 
reporting is a minimum of what must be communicated in such instances. 

There are types of social research where it is especially difficult 
to describe the procedures to be followed or where their full disclosure 
imposes limits on the technique. Three of these are singled out for special 
attention: exploratory studies, participant observation, and systematic 
social observation. 

Exploratory Research. It is particularly difficult to satisfy the 
criterion of informing about procedures and goals of inquiry in research 
that is essentially exploratory in nature and where no specific procedure 
is to be followed, a situation that is not uncommon in behavioral science 
research. This is often the case in solo- field or participant-observer 
research and in case studies. The investigator may utilize a host of 
exploratory procedures including observation and interviews, group dis- 
cussions, life history techniques, personal documents or records, participa- 
tion in events, and even assuming social participant roles in everyday 
life. Questions in exploratory surveys more often seek an open-ended 
rather than a closed or fixed response. The use of probes to explore 
the information that will enlighten, inform, or explain does not lend itself 
to prediction of the types of information that will be acquired. Such techniques 
may quite often obtain considerable material that is extraneous to the 
problems under exploration and may be matters which the subject would not 
otherwise disclose. Yet the acquisition of new knowledge must permit 
reasonable exploration. While a simple statement that the investigator 
wants to explore certain topics or matters will not suffice to inform the 
subjects or participants from whom information is sought, their permission 
to, quite frankly, "explore" or "look at in-depth" a number of matters 
should be allowed if an Institutional Review Board and Peer Review Committees 
consider the problem significant, if alternative ways of investigating 
the matter are not as promising, and if the investigator can be trusted 
to fulfill at least those conditions of notice which are applicable (such 
as allowing persons to refuse answers or withdraw from participation). 

Participant Observation poses special problems of satisfying the criterion 
of "informed consent" since the observer utilizes ordinary social roles as 
well as that of investigator to acquire information or to legitimate the 
observer role. Apart from questions of deception that arise when the dual 
nature of participation and observation is not explicitly stated, participa- 
tion itself may be utilized to gain an advantage before any information is 
gathered. Thus participation may serve to develop a trust relationship 
which then might be exploited by seeking subjects' consent to serve in a 
research role. The participant observer role, as previously noted, poses 
special problems of consent generated by the intersection of several dif- 
ferent roles in the same person. There seems to be more rather than less 
need to inform about the research role in participant-observer as compared 
with observer studies since the role of observer is easily confused with 
the role of participant. 

Systematic Social Observation is constrained, as previously noted, 
by difficulties in determining whose consent is required. There are, 
however, important limitations on securing consent from individuals who 
are being observed, limitations imposed by practical considerations of 
implementation, timing, and unpredictability about precisely who is to be 
observed in particular settings. It might not only be impractical to 
secure the consent of all persons at a public meeting but certainly it 
would be difficult to single out in advance all persons who might be 
active participants on whom the observation would concentrate. At times 
one can follow the procedure of announcing that one is present as an 
observer or one can secure the consent of persons in authority in the 
setting, but where these are not feasible there are few substitutes for 
securing the consent of those under observation. The extent to which one 
will forego the requirement of informed consent in systematic observation 
always will depend, of course, on an assessment of the risks involved in 
observation and protection afforded against harm from disclosure. 

(2) a disclosure of any appropriate alternative procedures 

that might be advantageous for the subject . (45 CFR 46.3:c-4) 

This requirement of notice derives from a bio-medical model of research 
where the role of investigator intersects with other roles, such as that 
of the medical specialist who has diagnostic or treatment options as 
alternatives to the research procedure. Alternatives also exist when there is more 
than one form of diagnosis or treatment, etc. When the role of experimenter 
is merged with that of impartial investigator, alternative forms of experi- 
mental procedures may be possible. Alternative procedures may also exist 
in studies that involve social intervention and evaluation of it or in 
participant observation. In such cases there is likewise a merger or inter- 
section of other roles with that of investigator. 

Most of the time, however, the question of advising about alternative 
procedures that might be advantageous to the subject is inapplicable in 
behavioral science research because the nature of any anticipated benefits 
does not involve a calculus of alternative procedures. Ordinarily, behavioral 
science inquiry does not promise benefits to research participants as a 
consequence of participation. Thus it is not germane to define alternative 
procedures that might be advantageous to subjects. Indeed, there are strong 
prohibitions against using procedures that may be more advantageous to 
subjects in behavioral science research on the grounds that such advantages 
may bias the results of the inquiry. That of course is an empirical question. 

(3) an offer to answer any inquiries concerning the procedures. 
(45 CFR 46.3:c-5) 

Quite obviously, any person in direct contact with a research subject 

or participant should answer questions about any of the elements that are 

stipulated as the elements in notice. There are certain other kinds of 

information, nevertheless, that a social investigator often supplies by 
way of notice and about which there must be direct answer if direct inquiry 
is made. These include the following: 

1. The person who seeks to elicit information or make any other 
procedure operative must provide a unique identification of self on direct 
inquiry. Ordinarily this should be done as a part of the procedure in securing 
informed consent. A subject has a right to know to whom information is 
given or who is performing research procedures; this might well be an 
element of notice in informed consent. The complaint form and the warrant 
or testimonial discussed later will provide documentation of persons who 
secure informed consent and undertake any procedures directly on a person. 

2. Requested information on auspices and sources of financial support 
s hould be answered on direct inquiry . Where promise is made to provide that 
information in the event that a particular employee is not familiar with the 
information requested, evidence must be provided that it was made and sup- 
plied. Normally, however, every employee who interacts with subjects should 
have a reasonable amount of information on auspices and sponsorship and 
principal investigators should be held responsible for informing them. 

3. Any request for information about unique identifiers, whether by 
means of data collection or other modes of identification, should be supplied. 
Questions about modes of observation and recording and whether they carry 
unique identifiers must be answered by an employee when questions are asked. 

4. Any request for information about mechanical aids to information 
recording should be answered, including information about how that or any 
other kind of information is to be protected, for how long, etc. 

We have had occasion to note that when procedures cloak some of the 
objectives of the inquiry, investigators may be excused from making those 
explicit if the result is to seriously damage the validity and reliability 

25-45 



of information and no particular harm attends most subjects who are involved 

in the procedure. Despite this exemption from affirmative action, it 

appears reasonable to stipulate that should any person explicitly inquire 

whether there is deception in any form, one must not only offer to answer, 

but must answer truthfully, so as not to deceive on direct inquiry. 

In brief, remembering that employees are members of an organization, 

all employees who have roles for eliciting informed consent or performing 

any research procedures directly on persons should be given sufficient 

information so that they may answer directly the questions stipulated 

above and any others deemed essential to informed consent. 

(4) an instruction that the person is free to withdraw consent 
and to discontinue participation in the project or activity 
at any time without prejudice to the subject. (45 CFR 46.3: 
c-6). 

The promise that the subject is free to withdraw consent and to 
discontinue participation at any time poses special problems for both 
subjects and investigators. One question that can be raised is under what 
conditions is that promise compromised by the consent procedure or the 
methods of inquiry undertaken by the investigator. 

First, whenever inducements have been offered to subjects to reward 
them for their participation and they are so advised at the time of consent, 
the inducements, particularly money, may affect any person's willingness 
to withdraw. It should be apparent that investigators should not offer 
inducements that are contingent upon completion of a particular task unless 
it is a matter of formal contract. Otherwise they can easily compromise 
a subject's wish to withdraw. 

Second, any promise of withdrawal is operative only at the level at 
which it is communicated. When consent is obtained for organizational 
personnel to participate in an investigation, there should be explicit 
agreement about whether such persons may voluntarily refuse to provide 
information or withdraw from participation. When it is agreed they may 
do so, it should be explicitly communicated to each participant. To 
illustrate, if a police commander agrees that an observer may ride with the 
police in his command to observe their behavior and that they have no right 
to refuse to cooperate with the observer, it is the commander's rather than 
the observer's obligation to communicate that to officers, and the observer has 
no right or obligation to advise an officer that he has a right to refuse 
or withdraw. 

Third, a promise of a right to withdraw or refuse to participate in 
cooperating with some aspects of the procedure may lack force where there 
are strong pressures from other sources to continue participation as 
noted in the discussion of inducements. Care should be taken to minimize 
the force of such pressures when they cannot be eliminated altogether by 
virtue of the fact they are natural social phenomena. 

Fourth, the promise of refusal or withdrawal may be an inadequate 
protection with some procedures and neither the person who elicits informa- 
tion or controls participation nor the person who is advised of the right 
to withdraw may be aware of the subtle ways that the decision to refuse or 
withdraw is brought to a threshold of consciousness and therefore raises 
the matter to a decision level of refusal or withdrawal. Where behavior or 
responses to stimuli, including verbal stimuli, are sequenced, much informa- 
tion may have been given that the subject may wish had not been given after 
the threshold is reached. This is not an uncommon result when interrogation 
is used in intelligence-gathering procedures; it may also occur in 
research techniques of questioning. A question arises whether persons who 
consent to participate should have such control over the information provided 

that they may demand that information already given now be withdrawn. Thus 

subject refusal or withdrawal may be inadequate when the person wishes to 

withdraw matters that are already a matter of record. 

Even were one to grant some right to expunge the record, there are 

real limits on the capacity to do so. One can expunge a written record, 

return a questionnaire or test that was completed, or in other ways destroy 

matters of record, including the record that consent was given! Whether 

such an option should apply to a right to expunge the record of consent is 

problematic. Yet, there clearly are conditions under which a person might 

wish to make that request such as when that record of consent or refusal 

to consent is incriminating or damaging to the participant. Limits to 

expungement arise, moreover, from the fact that one cannot obliterate the 

memory or experience of others. The most that could be required in such 

instances is an explicit prohibition against the use of such materials in 

any form or for any purpose. That is not, however, an enforceable rule where 

memory is at stake absent explicit evidence of use. 

(5) Any institution proposing to place any subject at risk is 
obligated to obtain and document legally effective informed 
consent. No such informed consent, oral or written, obtained 
under an assurance provided pursuant to this part shall 
include any exculpatory language through which the subject 
is made to waive, or to appear to waive, any of his legal 
rights, including any release of the institution or its 
agents from liability for negligence. 

One assumes that statements made to subjects holding that any assistance 

given to the subject cannot be regarded as an acknowledgement of liability 

or negligence by the institution or any of its agents are not exculpatory 

since they do not represent a disclaimer of responsibility for conduct but 

pertain to evidentiary questions at law. 



It is unfortunate that the traditions of tort liability in American 
law place such heavy emphasis on fault and negligence and fail to lay 
stress upon affirmative duties or responsibilities. Where human subject 
research exposes subjects to risk and there is reason to believe harm 
has occurred, tort doctrines might better stress affirmative responsi- 
bilities — the moral and legal obligation to give help. There are some 
exceptions in American law of affirmative doctrines, such as the Good 
Samaritan laws to protect heroic and other civic actions from tort 
liability. In human subject research, special consideration might be given 
to developing some exemptions from tort liability where the desirability 
of affirmative actions outweighs protection provided by tort liability. 

(6) a description of any attendant discomforts and risks 
reasonably to be expected. (45 CFR 46.3:c-2) 

From the perspective of behavioral science, this requirement of notice 
is unduly restricted by the bio-medical model of Human Subject research 
unless one construes the reference of "attendant discomforts and risks" to 
include any discomforts or risks that follow both directly and indirectly 
from participation in the research. It bears reminder that in behavioral 
science inquiry major risk of harm attends primarily from the disclosure 
of private matters rather than from specific procedures for eliciting informa- 
tion or the performance of tasks during the eliciting procedures. We shall 
assume that in behavioral science research the broader construction applies 
and merits close attention from Institutional Review Boards. 

Considerable difficulty attends the operationalization or interpreta- 
tion of the constraint "reasonably to be expected." Is that criterion to be 
applied on the basis of expectations for a population of all possible sub- 
jects? For a particular subject whose consent is being secured? Or, for the 
population at risk in the given research study — a population whose dimensions 
are only generally known, e.g., a random sample of the U.S. population? 
Is one obligated to assess separately risks for subclasses of a population — 
those identified by race, age and sex, for example? Or does one choose to 
adopt the risk in using a given procedure — survey research, for example? 

To have an exact probability for a population "at risk" is unlikely 
not only because it is difficult to obtain such probabilities but also because 
such information is at most available for some related population and one 
would have to assume that risk applied. Moreover, knowing the probability 
does not provide a decision rule for an investigator or an Institutional 
Review Board. Even a rule that the benefits must exceed the risks is un- 
satisfactory in itself, not only because as already noted such ratios can- 
not be applied to all behavioral science research but because both the 
level of the risk and the ratio of risks to benefits are at issue. There 
is a strong likelihood, in fact, that different Institutional Review Boards 
will adopt different decision rules both for a given level of risk and for 
cost/benefit ratio, thereby leading to inequities among investigators. The 
problem, of course, is not unique to scientific research since it is 
characteristic of all discretionary decision-making in systems where equity 
is at stake. 

There are also no clear guidelines in the regulations for the choice 
of a base to assess risk. Social scientists would ordinarily think in terms 
of probabilities of harm for a given population that is "at risk" or of an 
actuarial base. Yet if choice of risk and base population are permitted, 
one might opt for the risk element and the base that give the lowest risk. 
To illustrate, there is a fairly low probability that the survey method ever 
leads to employee disclosure of confidential material; enough evidence is 
available to permit one to conclude that the use of the survey method cannot 
reasonably be expected to produce unauthorized disclosure. Based on the 
risk of using the survey method in all studies, one would conclude that 
in the ordinary use of the survey, informed consent is not required. 
Similarly the risk of compulsory disclosure from the use of subpoenas is 
so low for all studies or even "sensitive ones" as to "obviate" the need of 
informed consent. If, however, the relevant criteria are the population 
at risk to a particular study where the population already is at risk for 
harm from past behavior, e.g., a population which is asked to report 
violations of law during the past year, the problem is not easily resolved 
as to whether their informed consent is required. On the one hand one 
might conclude that the risk of disclosure has been very low in such 
studies, but on the other hand the potential harm is not inconsiderable 
in a given case. 
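The dependence of an estimated risk on the choice of base can be made concrete with a small numerical sketch. The counts below are invented for illustration only; neither the regulations nor the record supplies such figures.

```python
# Hypothetical counts illustrating how the choice of base population
# changes an estimated disclosure risk. All figures are invented.

def estimated_risk(incidents, base_population):
    """Naive risk estimate: recorded incidents over the chosen base."""
    return incidents / base_population

incidents = 3  # recorded disclosure incidents (hypothetical)

# Base 1: all respondents ever studied with the survey method.
method_wide = estimated_risk(incidents, 1_000_000)

# Base 2: respondents in "sensitive" studies only.
sensitive_only = estimated_risk(incidents, 10_000)

# The same incident count yields a hundredfold difference in apparent
# risk, so an investigator free to choose the base can report the
# lowest figure.
assert abs(sensitive_only / method_wide - 100.0) < 1e-9
```

The arithmetic is trivial, but it shows why the regulations' silence on the choice of base matters: the "reasonably to be expected" risk is not a single number until the base population is fixed.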

There is, of course, the additional matter that a guarantee of confi- 
dentiality may be necessary to secure consent from the members of a popula- 
tion that perceives its risk to be high, e.g., criminal offenders or drug 
users. The relevant criterion here shifts to subject perceptions of risk 
of harm rather than to an actual assessment of the risk of harm. Where 
confidentiality is at stake, one perhaps must recognize that no simple rule of 
whether or not informed consent is mandatory is easily formulated. But, in 
any case, adequate legal protection has the capacity to reduce many social 
risks. It can be maintained, nevertheless, that in exchange for legal 
protection one is compelled to follow rules of informed consent, a 
requirement that proposed regulations follow (42 CFR 2a.4 and 28 CFR 22.26). 

The problem of risk assessment is, in any case, closely linked with the 
necessity to guarantee the unique identity of persons and information from 
disclosure if subject cooperation is to be secured. Where the procedure 
guarantees anonymity in the form of data collection as in the anonymous 
completion of questionnaires, the risk is close to zero. Yet the anonymity 
procedure cannot be instituted without consent to participate, though the 
extent to which consent must be informed to secure anonymous participation 
is moot. 

With some exceptions, to be discussed later, behavioral science 
research when gathering information that has unique identifiers has no 
interest in reporting information with unique identifiers. This follows 
from the fact that most behavioral scientists have an interest in ag- 
gregative levels of information. It is most easy to disaggregate data 
gathered from persons, families, and households, and most difficult to 
report it for certain kinds of corporate units such as multinational 
corporations. Much depends, however, on the number of units in a defined 
statistical universe and whether or not that universe is identified. Thus 
one could do some disaggregation in reporting analyses for 200 teachers 
but if they are all identified as coming from the same school, the level 
of disaggregation possible before unique identification occurs is much 
less than if the 200 teachers came from all schools in the United States. 
It would be relatively easy, moreover, to identify the male physical educa- 
tion teachers in a single school but more difficult if the sample were from 
200 schools. Yet some possibility would exist even at that level of 
disaggregation for the 200 national schools. Disaggregation must follow rules 
of its own to prevent disclosure. 
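The logic of the teachers example can be sketched in a few lines of code. This is a hypothetical illustration, not a procedure drawn from the regulations: it counts how many records share each reported combination of attributes, the cells with very few members being those that permit unique identification.

```python
from collections import Counter

def small_cells(records, keys, threshold=5):
    """Attribute combinations shared by fewer than `threshold` records;
    publishing tabulations for such cells risks unique identification."""
    counts = Counter(tuple(r[k] for k in keys) for r in records)
    return {cell: n for cell, n in counts.items() if n < threshold}

# 200 teachers, all known to come from a single school (invented data).
one_school = (
    [{"sex": "M", "subject": "physical education"}]
    + [{"sex": "F", "subject": "mathematics"}] * 120
    + [{"sex": "M", "subject": "mathematics"}] * 79
)

# Reporting sex by subject within the identified school isolates one person.
risky = small_cells(one_school, ["sex", "subject"])
assert risky == {("M", "physical education"): 1}
```

Were the same 200 teachers drawn from schools across the country, the lone cell could no longer be pinned to a named individual, which is the sense in which the permissible level of disaggregation depends on the size and identifiability of the reporting universe.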

There are situations, however, where reasonable expectations are that 
considerable risk may attend the securing of information because one is 
unable to protect the data against disclosure should one be compelled to 
do so. That condition arises whenever the State, at law or otherwise, 
compels disclosure. At law within the United States, absent statutory 
protection on disclosure, one may be compelled to disclose in response to 
subpoena, for example. The risk of coerced disclosure is considerably 
greater in comparative national research, however, since the capacity of 
foreign nationals to protect their data is generally without legal guarantee. 
The risk may be considerable in some societies for kinds of data that 
ordinarily pose little or no risk in American society. Whenever research 
is to be undertaken in foreign countries, Institutional Review Boards must 
give close attention to the capacity of investigators to protect their 
information even when informed consent is elicited, lest one become an 
agent of harm. 

The necessity of notice, however, hinges in part upon the definition 
of "subject at risk" already discussed. The HEW requirements in the Code 
of Federal Regulations stipulate that a subject is at risk when he " . . . 
may be exposed to the possibility of injury, including physical, psychological, 
or social injury, as a consequence of participation as a subject in any re- 
search, development, or related activities which departs from the applica- 
tion of those established and accepted methods necessary to meet his needs, 
or which increases the ordinary risks of daily life, including the recognized 
risks inherent in a chosen occupation or field of service" (45 CFR 46.3:b). 
It is hard to say that most behavioral science research is in any way 
necessary to meet the needs of most subjects, or that there is no possibility 
that research increases the ordinary risks of daily life or those inherent 
in an occupation or career. Possibilities always exist. I suppose that the 
possibility of psychological harm always exists and an operable question is 
whether a social research procedure has any more risk of psychological or 
social harm than the ordinary risks of social life. Think for a moment 
whether most behavioral science studies of pupils in schools are any more 
likely to do psychological or social harm than that done each day to pupils 
in many schools. My impression is that research suggests many teachers do 
more harm to students than do most investigators. Should one conclude then 
that because research in schools ordinarily does no more harm than that 
done every day by their teachers, one is justified in approving a proposal? 
What criteria are to be applied? Or consider another example: may the 
police ordinarily not do more harm to citizens than observers of police and 
citizen transactions? 

These examples are not offered to suggest that social 
science research does no more harm than the risks of everyday life or, if 
indeed that were true, that one should conclude that the criterion creates 
a tolerable level of risk in the society. They are intended rather to 
show that we know very little about the nature of risks in everyday life 
and that to know more is in itself an empirical question that would involve 
research on human persons and their organizations. There is danger that 
Institutional Review Boards will "create" risks that have little if any 
empirical foundation. The substitution of "informed guesses" is hardly a 
satisfactory solution to the problem, particularly in the assessment of 
risk. There is ordinarily a considerable range to subjective probabilities 
for any phenomenon. An Institutional Review Board is hardly a large enough 
sample to create even reliable estimates of subjective probabilities. In 
any case the relationship between subjective and objective probabilities 
can be positive or negative and they are often far from perfectly correlated. 
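The sampling point can be made concrete with a short simulation. This is a sketch with invented numbers, not part of the original report: the assumed "true" probability, the noise level of individual guesses, and the panel sizes are all hypothetical. It shows only that averages over a board-sized panel of about ten assessors scatter far more widely than averages pooled over a thousand.

```python
# Sketch with invented numbers: the variance of an estimated probability
# shrinks with sample size, so a ~10-member board yields far less reliable
# estimates than a large pool of assessors would.
import random
import statistics

random.seed(1)
TRUE_P = 0.10  # assumed "objective" probability of some harm, for illustration

def panel_estimate(n):
    # Each assessor's subjective guess is noisy around the true value,
    # clipped to the valid probability range [0, 1].
    guesses = [min(max(random.gauss(TRUE_P, 0.08), 0.0), 1.0) for _ in range(n)]
    return statistics.mean(guesses)

board = [panel_estimate(10) for _ in range(500)]    # many 10-person boards
large = [panel_estimate(1000) for _ in range(500)]  # many 1000-person pools

# The board-sized estimates are many times more dispersed.
print(statistics.stdev(board) > 5 * statistics.stdev(large))
```

The exact noise model is arbitrary; the qualitative conclusion (the spread of panel averages falls roughly with the square root of panel size) is the point.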

The Concept of Social Harm. It likewise is far from clear what is intended in defining the concept of "social harm," since it is not defined beyond the conception of "subject at risk." There is some implication that again what is intended flows from the elementary Human Subjects model. Social harm in the restricted sense would refer to the social consequences
for a subject. Such harm might range from a temporary experience of anxiety 
or forms of social embarrassment to far more serious consequences if private 
matters become public knowledge or are disclosed to persons who may wield 
social power over individuals. This can include the imposition of penal 
sanctions, loss of employment, social isolation or ostracism, and divorce, 
to mention but a few possible consequences that befall some persons when private matters become known to others. We have repeatedly noted that social harm in behavioral science research would most usually come about as a consequence of these latter sources, i.e., private matters become known to others who then do harm. Investigators and their methods are not ordinarily a source of serious harm to individuals apart from harm through disclosure.

There is another type of social harm, however — that which may befall 
corporate actors or collectivities when their behavior becomes public 
knowledge. A few illustrations may suffice to make the point. Disclosure of the financial condition of financial institutions might lead to a "run on the bank"; disclosure of an impending stock transaction within an organization might lead to the illegal act of "insider trading" (it is assumed the principal investigators would not themselves become insider traders!). The disclosure
that a particular employer discriminates against minority employees in 
employment could lead to legal actions against the firm. These are all 
instances where the disclosure of information that an investigator may ac- 
quire may do harm to corporate actors. If that information was acquired 
with a promise of trust, as is often the case, the investigator becomes 
an agent of social harm in this broader sense. 

It is inevitable, however, that some forms of social research do harm when results are published, that is, literally made public. Investigators cannot promise that their inquiry will reach a predetermined conclusion and indeed, given the nature of their fiduciary responsibility as scientists, they cannot offer such promises. Results are not usually intended to do harm, but they may bring harm
to corporate actors and their individual members. Where evaluation research 
is undertaken, as already noted, both social harm in the restricted sense 
of harm to persons and harm to corporate actors may occur with the disclosure
of the results from a given inquiry. Evaluation research often requires 
at least limited disclosure of identifying characteristics for the corporate 
actor. 

An important and major ethical dilemma is created for behavioral 
scientists when they enter a research relationship and extend a promise 
of a guarantee, including a legal guarantee, of confidentiality. Any 
promise of confidence prior to the disclosure of what must be held in 
confidence can become a source of a moral dilemma. Disclosures in confi- 
dence that acknowledge grievous social harm raise the question of whether 
an investigator is obligated to disclose the harm despite the promise of 
confidentiality. This dilemma is commonly faced by professionals in con- 
fessional or counseling roles. In general the norms that apply to such 
roles would appear to apply as well to the investigator's promise of 
confidence. Yet it seems unethical to extend such a promise if there are 
circumstances under which one cannot reasonably control unauthorized dis- 
closure, as when legal protection against compulsory disclosure is absent. 
For many types of private matters, approval perhaps should not be given, benefits aside, when disclosure would bring substantial social harm and the investigator has no formal legal sanctions or protection against disclosure.

One other matter about social harm should be clarified. It is appropriate for Institutional Review Boards to weigh the matter of harm both absolutely
and relatively. Harm is weighed absolutely when there is no reference to 
its relationship to potential benefits. Certain kinds of research and 
certain research procedures may be ruled out on moral or legal grounds, 
e.g., wire-tapping or electronic eavesdropping, with no reference to poten- 
tial benefits. Most of the time, a calculus of cost-benefit is applied to 
determine whether a project may be approved. Generally, if potential benefits outweigh social harm or costs, there are reasonable grounds for granting approval by this criterion.

Yet there are types of behavioral science research where a harm/benefit ratio is inappropriate. The harm/benefit ratio is often inappropriate in the study of corporate actors, as a consideration of examples may make
apparent. First, since in much evaluation research or in quasi-social experiments the outcome is not predictable, neither the social harm nor the social benefits to corporate actors can be calculated in advance of the actual investigation. Moreover, as already noted, an investigator cannot promise benefits from the results of the inquiry, though if some form of compensation is given by way of inducement, in a trivial sense that might be thought of as a benefit. Second, in yet other cases, what is social harm may simultaneously be social benefit. A conclusion that a substantial
proportion of banks have high risk investments can bring harm to these banks 
by bringing on an investigation of all banks during the course of which their 
condition is discovered and sanctions applied. At the same time, the dis- 
closure may lead to increased control of the banking industry in the public 
interest, a rather clear social benefit. It should be apparent that this instance is rather different from the oft-cited bio-medical example where one must first do harm to cause wellness, or to say that the first action is not harm since its intent is wellness. Social scientists may well have similar examples, but in the type case just presented, the same information causes both corporate harm and corporate benefit, albeit to different corporate actors. It follows, of course, that harm and benefit can be simultaneous for the same corporate actor and its members.

Mention already has been made of the need to protect persons and 
corporate bodies from the disclosure of private matters whether or not 
there have been promises of confidentiality. There is both a legal 
obligation to maintain such confidence when there is a prior fiduciary 
relationship and a moral obligation to do so when intruding upon the privacy 
of others. 

The matter of protecting the integrity of corporate bodies is one that 
is particularly troublesome for behavioral scientists. On the whole, less 
attention is given to preserving the anonymity of private matters of corporate 
bodies, yet the basis for doing so is not altogether clear. There is little 
evidence that the socially harmful consequences of such disclosure are examined, though in some kinds of research the investigator may actually "intend" harm, as in research undertaken in the spirit of muckraking sociology or social criticism (Marx, 1973). Social harm may also flow from the design of much
evaluation or action research where the disclosure of identity is built into 
the study design. 

Risks of damage or harm exist as well for corporate bodies that are 
the sponsors of behavioral science investigation. There is ample evidence 
of the political risks occasioned by scientific research (Shils, 1956) and 
behavioral science investigation (Sjoberg, 1967). Behavioral scientists and 
their sponsors also assume political risks in competing with journalists 
(Horowitz and Rainwater, 1970), lawyers, and other organized modes of inquiry as they challenge more traditional and established modes of inquiry with claims of "scientific truth." (3) Congressional investigations of private foundation funding, and of grants from public agencies for research into controversial social issues and their ethical standards in research on human subjects, impose political risks and governmental control over inquiry.
On the whole, behavioral scientists have been given to viewing these investigations as attacks on, or threats to, academic freedom and free inquiry. They are less commonly viewed as risks and moral dilemmas for such organizations, which they often are as well. The moral dilemma of a university sponsor such as Harvard University, faced with a broad mandate to protect students, academic freedom, and the reputation of the university in the psilocybin research of Leary and Alpert (Benson and Smith, 1967), is given much less attention. Yet in that case, as in many others, research sponsors are moved to institute controls over investigation as a resolution of political and moral dilemmas. The moral imperatives of protection, with their attendant risks, become a central focus of any organized effort to control bio-medical and behavioral science inquiry.

The question of how much social harm may result from a particular 
inquiry is often closely linked to whether or not an investigator may 
forestall potential harm or take steps to protect from social harm. We 
shall examine below some of the matters that raise problems of protection, 
particularly those related to unique identifiers and the public disclosure 
of matters that cause harmful reactions. At the same time we shall briefly 
consider the matter of protection from harm, though that is treated more 
extensively in the third section on confidentiality. 

Unique Identifiers. A unique identifier is any information that will permit someone other than the actor to whom the identification applies to identify that actor, whether person, corporate, or collective. When any
other information can be attached to a unique identifier by ordinary 
evidence, a disclosure problem exists. 

Unique identifiers will vary in terms of the evidence they provide for 
exact identification according to rules of evidence and inference. Some 
identifiers have a high degree of precision, e.g., fingerprints or voice-prints. Photographs, like signatures, are somewhat less precise means, but their evidentiary value is substantial. Other identifiers are still less exact, such as names and addresses. Still others require more inference from the evidence, such as the race, age, and sex of a person at a given address. It follows that the more exact or unique the identifier is by evidentiary rules, and the less inference is necessary in making a unique identification, the more protection should be provided if harm may result from disclosure.
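The claim that even inexact attributes can combine into a unique identifier can be illustrated with a small sketch. The records below are invented for illustration; the idea is simply that the fewer records share a combination of ordinary attributes such as race, age, sex, and address, the less inference is needed to identify a person.

```python
# Illustrative sketch (invented data): even without names, a combination of
# individually common attributes can single out a record, making the
# combination function as a unique identifier.
from collections import Counter

records = [
    {"race": "white", "age": 34, "sex": "F", "block": "400 Elm"},
    {"race": "white", "age": 34, "sex": "F", "block": "400 Elm"},
    {"race": "black", "age": 61, "sex": "F", "block": "712 Oak"},
    {"race": "white", "age": 34, "sex": "M", "block": "400 Elm"},
]

def quasi_key(r):
    # The "quasi-identifier": attributes that are individually unremarkable.
    return (r["race"], r["age"], r["sex"], r["block"])

counts = Counter(quasi_key(r) for r in records)

# A record is uniquely identifiable when no other record shares its key.
unique = [r for r in records if counts[quasi_key(r)] == 1]
print(len(unique))  # two of the four invented records are unique on the key
```

The same counting logic underlies modern disclosure-review practice: the smaller the group sharing a combination of attributes, the stronger the protection that combination requires.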

The unique identification problem in research must also be viewed in 
terms of potential processes of disclosure: how the unique identification 
is made to bring about the disclosure. We shall not review all such processes but note that they all relate to how access to unique identification and other information is obtained and how one becomes accessible to physical and testimonial evidence. Both present substantial problems for behavioral
science research. 

Access to Physical Evidence. Clearly, access to exact identifiers such as voice-recordings, video-tapes, photographs, and fingerprints poses very special problems for social research. Such unique identifiers pose the special questions of whether they are necessary to the inquiry, what protection against access is provided, and how and for how long such records are retained. Not only should considerable precaution and security attend their acquisition and retention if they contain potentially harmful information, but some provision must be made concerning their retention and eventual destruction as forms of evidence. Destruction should be guaranteed where applicable, and under some circumstances Institutional Review Boards should require stipulation of these plans. Destruction of exact identifiers should be provided for at the conclusion of research, except under the most extraordinary and compelling circumstances for their retention in subsequent research. The earlier destruction can feasibly be undertaken, the greater the security provided.

It should be apparent to all involved in research that, absent legal protection for unique identifiers and the other information related to them, they constitute damaging forms of evidence when there is potentially harmful information. It should also be apparent that it is far more difficult to eradicate testimonial than physical evidence. These are compelling considerations where serious damage may result from disclosure of information with unique identification.

Access to Settings. When physical or oral evidence is obtained, it must occur in social settings. Social settings vary considerably in their accessibility to persons other than authorized research personnel. The same holds for access to processing and storage once information is acquired. Private places are less accessible to both authorized and unauthorized intrusion, for example, than are public places. Vulnerability, therefore, is greater in systematic social observation in natural social settings than in contrived ones in private places. Where potentially damaging information is obtained, there must be reasonable means of protection against access during the data acquisition, processing, and storage phases. Above all, Institutional Review Boards should be mindful that the acquisition setting is often the most vulnerable point in the research process, at least from an evidentiary perspective. We would remind again that foreign settings are generally more vulnerable than domestic ones, public places more than private places, natural settings more than contrived ones, and physically unprotected settings more than protected ones.
Where the possibility of testimonial evidence exists — as it usually does 
unless the procedure is constructed so as to provide anonymity from all 
persons involved in the research process — the problem deserves special atten- 
tion. 

Note should be taken here of a separate but related issue: the dangers of disclosure through the access given by didactic use and by dissemination through agents other than those of scholarly publication. An
Institutional Review Board may wish to grant approval subject to some con- 
straints on either mode of access. Didactic use of confidential information 
is common in teaching and training of research specialists and practitioners 
or in other forms of training. Where serious harm could result from dis- 
closure of information, it is doubtful that unique identification should 
ever be allowed in behavioral science teaching and training. The problem 
is a more difficult and serious one in bio-medical research where living 
subjects are used in training. The problem is a critical one since protection 
is generally afforded only to employees. Students, trainees, and others who 
are not employees ordinarily are not qualified for protection unless specifi- 
cally appointed as employees. 

Similarly, the sharing of confidential information where unique 
identification is possible with colleagues and its dissemination through 
forums and media must be carefully protected. Sharing such information with journalists is particularly risky, and its sharing for any public
purpose such as law enforcement or regulation must be precluded and 
legally protected. 

Testimonial Evidence. Little need be added to the problem of testimonial evidence beyond what has already been said. Given the special vulnerability of testimonial evidence, viz., that it cannot be totally eradicated except under the most extreme of measures taken against persons (and means that must be morally repulsive to any scientific investigator, e.g., homicide), it presents special problems.

The first problem is that of unauthorized disclosure and deliberate 
misuse by members of the research team or others who obtain unauthorized 
access. While there are some legal protections available in both tort and 
criminal law to sanction persons who deliberately misuse or disclose 
damaging confidential information, they are ordinarily weak remedies for 
those harmed and they do not provide any means of preventive control for 
those responsible for their protection in the research process. It is unlikely that reasonably effective preventive control can be provided by institutional sponsors and principal investigators unless there are strong and specific legal sanctions against unauthorized disclosure and misuse, which are inadequately provided by tort and criminal law. Such protection is provided for in the LEAA proposed Code of Federal Regulations (28 CFR 22.29 and Commentary).

The second problem is that of compulsory disclosure through trial proceedings and subpoena. Behavioral science research has proved to be increasingly vulnerable to the threat of subpoena (Nejelski and Peyser, 1975: EL2-B24; Nejelski and Lerman, 1971). Adequate protection in this respect
would seem to be provided in the HEW proposed Code of Federal Regulations that stipulates: "Persons so authorized may not at any time be compelled
in any Federal, State, or local civil, criminal, administrative, legislative, 
or other proceeding to identify the research subjects encompassed by the 
Certificate except in those circumstances specified in paragraph b of this 
section." We shall have occasion to refer to those exceptions later. I 
make special note of the caveat "would seem" since these are complicated 
legal matters and the Code of Federal Regulations is itself subject to 
subordination by present and future Federal legislation on specific matters 
and the constraints of the Privacy and Freedom of Information acts. There 
is and will be case law and there are related constitutional issues. 

Just how much of the information on risks should be communicated in 
securing informed consent is problematic. A requirement of full disclosure to secure consent could be burdensome and have consequences for the reliability and validity of information. It follows that the more one is legally protected and the more sanctions are available to forestall disclosure, the less specific information need be communicated. A simple statement that sets forth the form of protection available may often suffice, particularly when there is strong protection, as with legal protection of confidentiality against disclosure.

Potentially Chilling Effects of Full Information . Behavioral science 
investigators may well overestimate the possibilities of the chilling ef- 
fects that full disclosure of the information required by the elements of 
notice may have upon cooperation and the reliability and validity of information. The problem is not a simple one, since compliance with a full disclosure rule may not only create a greater possibility for free choice but also raise unrealistic doubts and concerns that are damaging to free inquiry.

We do not propose to discuss the problem fully here. We would simply note that some matters would seem more important than others, such as the necessity to inform about unique and exact identifiers and about what protections are afforded for confidentiality.

There is, nonetheless, one special problem that deserves attention. 
It is axiomatic that any form of regulation has possibilities for its 
evasion and that any form of protection has possibilities for leaving one 
vulnerable to harm. Both require brief comments. First, patterned evasion 
will inevitably develop among Institutional Review Boards and among 
Principal Investigators if requirements unduly constrain free inquiry or 
prove unusually burdensome. Second, current regulations now provide a 
possibility for leaving persons vulnerable and unprotected on evidentiary 
grounds. A single example may call attention to this. The requirement of written informed consent signed by subjects provides signature evidence and ordinarily provides fingerprint evidence as well. Were such evidence to be secured, by compulsion or otherwise, then, since such evidence can be damaging to persons, that particular requirement has made for a greater possibility, and perhaps likelihood, of harm!

We note one other related matter in passing since we shall have occasion to consider it later. The bureaucratization of regulation may easily prove
burdensome and lead to patterned evasion as well. A requirement, for 
example, that one keep a log of all persons who have had access to confi- 
dential records may readily lead to evasive tactics and more rule-making 
which in turn may generate evasion. 

(7) a description of any benefits reasonably to be expected; (45 CFR 46.2:c-5)

The elementary Human Subjects model is predicated on the presumption that ordinarily participants in research are subject to some other form of intervention that is designed to benefit the participant directly. The research intervention is coupled with another form of intervention that is designed to do good. That model is largely inapplicable to most behavioral
science research even when good may result from the research. This is so 
for a number of reasons. 

First, and it hardly bears repeating, most participants in social 
science research are related to investigators solely through the research 
role; there are few if any direct side benefits. 

Second, most behavioral science research has an interest in descrip- 
tions for aggregates rather than individual or corporate units and seeks 
generalizations at an aggregative level. Disaggregation to the point of 
unique identification is rarely useful for the dissemination of knowledge. 

Third, where benefits are possible, they ordinarily arise from the 
production of knowledge that will help an aggregate or class, of which the 
participants are only representatives. They are thus class rather than 
individual benefits. Benefits, moreover, often may not flow from a 
particular inquiry, except to the scientific community, since a particular 
benefit may flow only from the cumulation of knowledge. 

Fourth, in many cases the benefits, therefore, are not predictable in advance, and we would remind again that the same knowledge may bring both harm and benefit.

Fifth, the benefits from behavioral science research often are expected to redound to the sponsors of research. Most assuredly, many federal dollars are spent on behavioral science research not only because the government is operating in its general role of serving the public interest and welfare but also in its more special one of making policy and program decisions. Evaluation research and program research are expected to bring pay-offs in practice and in decision-making.




Put another way, the beneficiaries of behavioral science knowledge 
are generally principals and third parties. Investigators may be rewarded 
for discovery and additions to knowledge. The public interest may be 
served collectively. The sponsors may make practical use of the knowledge. 

Much behavioral science research has engineering, enlightenment, and intelligence benefits only (Crawford and Biderman, 197 ). The most usual benefit is enlightenment for a scientific community and the public. It becomes an element on the basis of which they can more intelligently relate to the problems before them, whether as citizens, officials, or workers in some other role. Behavioral science knowledge has a special relationship to the making of social policies and serves therefore an intelligence benefit. The policy-maker utilizes the special knowledge to sense the problem and the actions that may be taken. But it is only one of a number of elements in the formulation of public or private policy. A third use is its engineering benefit, its utility in direct application. As an example, a study of the use of a modus operandi file in police work may result in immediate changes in the structure and use of that file.

There is, naturally, as in all science, a reasonable amount of what is 
called basic science research, the acquisition of knowledge that will make 
new knowledge or increase the production of knowledge. To forecast the 
benefits of a particular study to basic science is precarious at best. 






FOOTNOTES 

1. There are some statutory limitations on consent where proprietary interests prevail or when exchanges are privileged.

2. The more unplanned the intrusion into private matters, the more 
complicated are problems of "informed consent" and "protection of 
the sources of information," matters treated below. 

3. Note that I do not argue that we have a more legitimate claim to 
"truth," whether or not it is made in the name of scientific inquiry, 
but simply that our claim to science opens us to political challenge. 









These matters considered, cost-benefit decision rules in decisions to grant or withhold approval are both troublesome and inapplicable. A few additional issues are raised, however, with reference to the element of benefits in notice, and these are now considered.

Participation in behavioral science research often may involve benefits that are particularly difficult to measure, viz., psychic benefits. Studies of the old and retired, for example, often report the pathos of the pleasure that the investigators' attention brought to those who are all too often socially isolated and neglected. The psychic benefits of prestige, satisfaction, and a sense of achievement are open to exploitation in research, but more often than not participants regard them as benefits. I do not know how they can enter in any precise way into a cost-benefit calculus. Even were research to provide us evidence for inference and prediction, that research ordinarily is not available and is not obtainable without prior research on persons.

Protection Against Disclosure . There is no explicit provision in 
the elements of notice to stipulate that participants be informed of the 
nature of the protection offered against disclosure. Presumably that 
matter is included among the risks one might stipulate. There is reason 
to maintain that it should be an explicit element of notice in securing 
informed consent. The obvious reason is that disclosure is potentially very harmful. But there are other reasons. At least where confidentiality is
at issue, as it is in any research where unique identification is an element 
in the design, there is a problem of special protection. All persons and 
corporate bodies have a right to know to what extent they are protected 
against disclosure whether or not the investigator defines the information 
sought as a private matter; subjects may regard it so. If they do, there is even a potential side-benefit for investigators if protection is afforded: it may increase the participation rate and enhance the validity and reliability of the information. Moreover, many studies must make representations about confidentiality. Institutional Review Boards ought to know what those representations will be. It seems intolerable to permit the extension of confidentiality when protection is weak or lacking. Both Boards and participants should be informed of protection against disclosure.

The Extent to Which Notice is Explicit and Full. Apart from the problem of potential chilling effects already alluded to, questions arise as to how one will decide how explicit and how full notice shall be. What rules shall guide decisions about the form of notice? A criterion of reasonableness, for example, must itself be given guidance.

The problem of information overload is a common one in information processing, and research participants are also subject to information overload. Overload may not only constrain free choice because it makes matters less than clear or even unintelligible, but it may also induce compliance by unduly impressing some potential participants. When people do not understand, they do not always withdraw; they may want to find out more, or believe that it is a good thing to go along with something that is that impressive. It is well established that, given differences in levels of education and comprehension, a single form of notice must be intelligible to those with the minimum education and comprehension in the population under study.

Forms of Documenting Informed Consent 

The elicitation of informed consent is primarily a matter of procedure or method. Our previous discussion focused on the requirement that, regardless of specific modes of elicitation and procedure, they must permit "free choice." To some it may be surprising that both recommended and approved federal regulations for informed consent do not similarly permit the participant to choose whether or not the agreement to participate is documented by the participant in some form of unique identification. The decision that the participant, or his/her representative, must give written consent if they choose to consent, unless the investigator is exempted from the requirement to obtain informed consent, is a restriction on the participant as well as the investigator and is a constraint upon his/her freedom to choose.

The main reasons for requiring written consent are presumably twofold (though they are never made explicit): written consent provides a means to audit the conduct of investigators, and, as evidence, it affords investigators legal protection. It may also afford participants similar legal protection if it is also signed by the investigator or an authorized representative and preferably attested to by a third party.

It appears that no consideration is given to whether the legal protection afforded by written consent should override a participant's free and willing choice to participate without giving written consent. We note now, and shall discuss below, the fact that some other forms of documentation, such as testifying to the fact that a given person or corporate body gave informed consent, do not similarly constrain choice while affording the participant equally full protection.

The choice among the options, e.g., written consent, testimonial/warrant, and testimonial cum complainant notice, discussed next, involves choices about who deserves protection from what, who is to be regulated, and how regulation by consent is to take place. Specifically, we shall compare each of the three modes of documenting informed consent in terms of their grant of freedom of choice, regulation by audit and complainant mobilization, and protection through affording legal evidence.

Written Informed Consent. The HEW Code of Federal Regulations (45 CFR 46.10) specifies the actual procedures to be utilized in obtaining "legally effective informed consent" by documentation. The documentation of informed consent must employ one of three forms:

1). "Provision of a written consent document embodying all of the
basic elements of informed consent. This may be read to the sub-
ject or to his legally authorized representative, but in any 
event he or his legally authorized representative must be given 
adequate opportunity to read it. This document is to be signed 
by the subject or his legally authorized representative. Sample 
copies of the consent form as approved by the Board are to be 
retained in its records." (45 CFR 46.10: (a)) 

2). "Provision of a 'short form' written consent document indicating 
that the basic elements of informed consent have been presented 
orally to the subject or his legally authorized representative. 
Written summaries of what is to be said to the patient are to 
be approved by the Board. The short form is to be signed by 
the subject or his legally authorized representative and by an 
auditor witness to the oral presentation and to the subject's 
signature. A copy of the approved summary, annotated to show 
any additions, is to be signed by the persons officially ob- 
taining the consent and by the auditor witness. Sample copies 
of the consent form and of the summaries as approved by the 
Board are to be retained in its records." (45 CFR 46.10: (b)) 

3) . "Modification of either of the primary procedures outlined in 

paragraphs (a) and (b) of this section. Granting of permission 
to use modified procedures imposes additional responsibility 
upon the Board and the institution to establish: (1) that the 
risk to the subject is minimal, (2) that use of either of the 
primary procedures for obtaining informed consent would surely 
invalidate objectives of considerable immediate importance, and 
(3) that any reasonable alternative means for obtaining these 
objectives would be less advantageous to the subjects. The 
Board's reasons for permitting the use of the modified procedures 
must be individually and specifically documented in the minutes 
and in reports of the Board's actions to the files of the institu- 
tion. All such modifications should be regularly reconsidered 
as a function of continuing review and as required for annual 
review, with documentation of reaffirmation, revision, or dis- 
continuation, as appropriate." (45 CFR 46.10: (c)) 






Setting aside the provision of a modified procedure, it should be 
noted that there are several significant omissions in the written informed 
consent attested to by the research subject and/or others. Brief mention 
is made of each of these since we shall wish to compare several modes of 
subject and investigator protection later. 

1). No provision is made for advising subjects or their representa- 
tives as to who retains the signature document, for what purpose, and of 
how it will be protected and used. 

2) . No provision is made for documentation of refusal to grant 
written and informed consent and of how that is to be protected and used. 

3) . No provision is made for documentation of withdrawal of informed 
consent once given and what rights, if any, the person has in information 
provided prior to that point. 

4) . No provision is made for the document to be signed by the person 
officially obtaining the informed consent in all options (altogether absent 
except in the abbreviated form of consent) or for location, date, and time 
that consent was obtained. 

5). The provisions are silent on the matter of who retains the signed 
informed consent document. The investigator is not obligated to provide 
a copy to consenting persons, and if only a single copy is required, as 
appears to be the case from the language of the regulations (e.g., 45 CFR 
46.10: (a) "This document is to be signed by the subject or his legally 
authorized representative. Sample copies of the consent form (italics mine) 
as approved by the Board are to be retained in its records.") it apparently 
is to be retained in the principal investigator's files for purposes of 
audit. 

6). No provision is made to advise persons as to who will have access 
to the signature document and for what purposes. Under federal law, federal 

auditors may have access to such documents for purposes of audit. Their 

powers probably include a right to inquire of persons whose informed consent 

was obtained or consent refused, i.e., whether or not specific procedures were 

accomplished or specific information obtained, though not what a person said 

or how he behaved. The extent of auditor powers over information on informed 
consent is a matter for further clarification, however, and perhaps provision 
should be made to limit those powers should they appear overly broad, intruding 
upon privacy and thereby failing to adequately protect both subjects and investi- 
gators. The proposed LEAA regulatory code stipulates that all 
"research and statistical information identifiable to a private person" may be 
revealed on a need-to-know basis only to: 

"a) Officers, employees and subcontractors of the recipient of 

assistance; 
"b) LEA/, staff; and 

"c) Persons or organizations for research or statistical purposes. 
Information may only be transferred for such purposes upon a 
clear demonstration that the standards of Section 22.26 have 
been met, except that when information is transferred to persons 
other than LEAA or project staff, that the transfers shall have 
been conditioned on compliance with a Section 22.24 agreement." 
(28 CFR 22.21) 
The above provision makes explicit that for all LEAA grants at least 
three categories of persons or organizations may have access to information 
provided by informed consent without the explicit consent of the participant 
provided that there is a "need-to-know." While there are rules and precedents 






that govern "need-to-know" and the proposed LEAA code provides rather full and 
explicit guarantees of protection of the information through the confiden- 
tiality certificate, transfer agreement, and sanctions provisions, it seems 
possible that since auditors or staff persons are not covered in the regula- 
tions by the provision for immunity from legal process, some possibilities 
for compulsory disclosure still exist. In any case, federal regulations 
cannot preclude federal auditor powers, absent explicit legislative restric- 
tion on such powers. There would seem to be good reason to seek to restrict 
such powers, at least to protect any confidential identifiable information 
so obtained from authorized or compulsory disclosure. 

At the same time, any regulations should make explicit the classes of 
persons, including those of the sponsoring institution, that shall have 
access to confidential information and the conditions pertaining thereto. 
Should Institutional Review Boards, for example, in connection with their 
review and regulatory powers have the right to information on who did and did 
not grant informed consent and whether the provisions of protection and con- 
fidence were fulfilled by the principal investigators? Given their powers 
and responsibilities for approving projects and continuing review (45 CFR 
46.6 and 46.7: (g)) they may have a right to such access; if so, it should be 
clearly stipulated by regulation. 

Special Features of Documenting Informed Consent by Signature. A number 
of special problems arise in documenting informed consent by requiring all 
consenting persons or their legally authorized representative to attest to 
their consent by signature on the written consent form. These special 
problems need not arise for some other forms of consent. 






Documenting informed consent by participant signature (written consent) 
makes the unique identity of each participant known to an investigator, even 
when the object of all other procedures is to insure anonymity to all parti- 
cipants. Having information on the unique identity of participants creates 
ipso facto (ipso jure) a problem of protection where confidentiality char- 
acterizes the information. 

There are three principal ways that investigators procedurally protect 
from disclosure the identity of participants and the information they provide, 
by selecting different accessioning and eliciting procedures. 

First, investigators can protect the identifiability of information by 
the manner in which they procedurally accession participants. Procedures for 
accessioning participants range from techniques that 
make it impossible for anyone other than the subject to know he/she is a par- 
ticipant — the anonymous participant — to procedures that provide for their 
unique identification. Regardless of what accessioning procedure is used, 
however, the requirement for documentation of informed consent by signature 
makes it altogether impossible to have anonymous participants. The require- 
ment is a burdensome restriction since it forces investigators to protect the 
anonymity of participants in research where disclosure of their identity as 
participants is in itself potentially harmful and where the investigator could 
otherwise protect them by anonymous participation. 

Second, the procedure for eliciting information can be anonymous. Under 
these conditions the documented informed consent by signature poses no problem 
of uniquely identifiable information. Whenever there is no reason why the 
unique identity of participants should not be made public or when it is indeed 
public knowledge, documentation by signature provides protection to investigators 






in accessioning participants and they may still protect anonymity by their 
collection procedure. There are, however, distinct limitations to anonymous 
data collection techniques since they preclude certain types of study design 
and forms of analysis. 

Third, when both the accessioning and elicitation procedures provide for 
unique identification, protection can be afforded by separating both the docu- 
mentation of informed consent by signature and any other identifiers that may 
permit unique identification of information from the information itself. 
Whenever consent is documented by signature, however, the evidentiary problems 
in separation are compounded so long as one is required to maintain the docu- 
mentation of consent. 
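The third strategy — holding consent documentation and identifiers apart from the substantive information, tied only by a non-identifying link — can be sketched in modern terms. The following Python illustration is an assumption-laden sketch: the field names, the random linkage-code scheme, and the two-file layout are inventions for exposition, not part of the report or of any regulation.

```python
import secrets

# Two separate stores: identifying consent records in one, substantive
# responses in the other, joined only by a random linkage code that
# carries no identifying content of its own.
consent_file = {}   # linkage code -> identifying consent record
data_file = {}      # linkage code -> substantive responses only

def enroll(name, address, consented, responses):
    code = secrets.token_hex(8)          # random, non-identifying key
    consent_file[code] = {"name": name, "address": address,
                          "consented": consented}
    if consented:
        data_file[code] = responses      # no identifiers stored here
    return code

code = enroll("A. Respondent", "12 Elm St.", True, {"q1": "yes", "q2": 3})
assert "name" not in data_file[code]     # responses hold no identifiers
# Destroying consent_file, or holding it under separate legal protection,
# leaves the data file effectively anonymous.
```

As the text notes, the weakness of this arrangement is evidentiary: so long as the signed consent documentation must be retained, the linkage it preserves can in principle be compelled.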

It is unfortunately the case that modes of accessioning participants in 
behavioral science research often must provide for knowledge of one or more 
identifiers that can lead to unique identification. For example, if one wishes 
to have a random selection of households or persons in the United States, one 
must obtain information on certain identifiers such as an address and on 
household characteristics to select a respondent, e.g., "head of household." 
Since in a given unit at a given address, the head of household is often a 
unique identifier, e.g., in one person households, a single identifier can 
provide unique identification. 

Yet it is well to bear in mind that in much behavioral science research, 
our interest does not lie in these identifiers as a means of identifying 
unique individuals but in social aggregates. The identifying information is 
incidental to the participant accessioning or data collection procedure. To 
return to our examples, we do not select an address or a phone number by 
random means to know whom we are uniquely getting information from but to 




insure that in the aggregate we are getting information from participants who 
represent classes of participants or who in the aggregate will describe the 
universe of participants in which we are interested within a given range of 
error of estimation. 

Although our procedures then may make it necessary to collect information 
that falls in the class of identifiers that individually or collectively may 
lead to unique identification, any procedure of securing consent should not 
invariably compel the collection of a unique identifier such as a signature. 
Ordinarily unique identifiers should not be obtained in behavioral science 
inquiry unless they are essential to the inquiry or its design. Since they 
usually are not essential, documentary evidence of unique identity is burden- 
some since it increases risk of disclosure and correlatively the need for 
protection of confidentiality. That risk of course can be balanced by forms 
of legal protection, but the only certain protection is not to document 
the accessioning of participants and their informed consent with unique 
identification procedures. 

When an investigator seeks to protect participants by anonymous means of 
accessioning participants or data collection, although informed consent should 
be obtained, the requirement of documentation of that consent by participant 
signature should be waived altogether. The other requirements of abbreviated 
consent, moreover, should be obviated, particularly the provision "of con- 
siderable immediate importance," since that is rarely applicable in behavioral 
science inquiry. 

Written informed consent is not altogether practical with some 
techniques of data gathering. The consent to enter a private place to con- 
duct an interview may be necessary before any written informed consent could 
be obtained, for example. Such field setting difficulties should be sufficient 




grounds for waiver of some requirements of consent. 

As noted previously, organizational behavior research raises questions 
about the obligation to secure written informed consent. Where evaluation 
research is a matter of formal contract, the consent of employees may not be 
required, e.g. observation of them at work. Likewise documentary research 
poses special problems, particularly in matching records. Apart from the 
fact that it is impractical to obtain written and informed consent from the 
deceased or from those who have moved or otherwise cannot be located, the 
risk of disclosure or harm in their use ordinarily is no greater than that 
arising from their retention by the original data source. With guarantees 
of confidentiality, risk should ordinarily be so low as to preclude require- 
ing any form of consent for the use of many kinds of records. 

Informed consent that is written and documented by signature can serve 
to constrain refusals to respond or to withdraw from participation. There is 
some evidence that signing any consent document makes it more difficult to 
break a trust relationship or agreement. The basic fiduciary element in 
any contract is not that easily broken — no matter how fragile it may seem 
in a modern world — and some participants will find it harder to break the 
relationship of commitment than others. It is well to remember that to bind 
investigators to do right may also bind their subjects so that they are less 
rather than more free. An informed and documented consent has such elements. 

Subjects or participants, moreover, often are willing to consent but 
not sign. Signatures arouse suspicion and affect the willingness to partici- 
pate. Whenever a unique identifier is not essential to the research study, 
it seems burdensome to require a signed or third party attestation proce- 
dure since on the average it will reduce the participation rate (a source 
of error) and may affect the validity and reliability of information (other 




sources of error). Above all, it becomes more difficult to control and 
measure error in estimation for aggregates. 
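The claim that a reduced participation rate is itself a source of error in aggregate estimates can be made concrete with a little arithmetic. The numbers below are invented for illustration only; the point is simply that when those who refuse to sign differ systematically from those who consent, a lower participation rate shifts the observed aggregate further from the true value.

```python
# Illustrative arithmetic only: all rates and prevalences are invented.
true_prevalence = 0.30            # trait share in the full target population
rate_without_signature = 0.90     # hypothetical participation rates under
rate_with_signature = 0.60        # the two consent procedures
prevalence_among_refusers = 0.50  # refusers assumed to differ on the trait

def observed(rate):
    # Prevalence among responders, given that the population is a mix of
    # responders and refusers: p = rate*p_resp + (1 - rate)*p_refuse.
    refuser_share = 1 - rate
    responder_total = true_prevalence - refuser_share * prevalence_among_refusers
    return responder_total / rate

bias_without = observed(rate_without_signature) - true_prevalence
bias_with = observed(rate_with_signature) - true_prevalence
assert abs(bias_with) > abs(bias_without)  # lower participation, larger bias
```

Under these assumptions the signature requirement roughly sextuples the bias in the aggregate estimate, and since the size of the bias depends on how refusers differ — which is by definition unobserved — the error becomes harder to control and measure, as the text argues.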

Finally we shall discuss some problems that arise from the require- 
ments for a modification of the written informed and signed consent 
procedures. The requirements that an abbreviated procedure meet the tests 
that "either of the primary procedures for obtaining informed consent 
would surely invalidate objectives of considerable immediate importance, 
and that any reasonable alternative means for attaining these objectives 
would be less advantageous to the subjects" (45 CFR 46.10: (c)2(3)) derive, 
it would seem, from the bio-medical Human Subject model (where they surely 
may be appropriate) rather than from a behavioral science model. We call 
attention here to their inappropriateness when one or more of these condi- 
tions prevail, a condition of common occurrence in behavioral science in- 
quiry. First, often proof is lacking that the primary procedure would 
invalidate the objectives of the study in the particular case of which the 
application for approval is an example. Second, the objectives of behavioral 
science research are only rarely of "considerable immediate importance." 
And, third, since the means used often are of no particular advantage to 
subjects, the consideration of alternatives is usually inapplicable. 
Indeed, the only condition of the modification procedure that is ordinarily 
applicable to behavioral science research is "that the risk to the subject 
is minimal" (45 CFR 46.10: (c)(1)). It should ordinarily, therefore, be 
the only condition required for modification. Other conditions that are 
not stipulated seem more applicable in permitting modification of written 
informed consent procedures in behavioral science inquiry, such as permission 
to modify when the need for information on unique identifiers is absent or 
when such identifiers need not be retained after the participant is located 
for the data collection phase of the inquiry. 

Investigator's Signed Testimonial or Warrant of Informed Consent. We 
have noted that the signed written informed consent procedure currently 
governing HEW sponsored research lacks some protections for subjects or 
participants in research. Some of these would be added to current proce- 
dures but others cannot be added without changing its form to that of an 
investigator testimonial or warrant. It is to this form we now turn. 

Both participants and investigators require some form of protection 
in behavioral science research. In the matter of informed consent partici- 
pants must be informed particularly of risks and/or benefits so that they 
may make a free choice about participation. Investigators must be protected 
from being compelled to be agents of harm toward subjects as a consequence 
of their participation in the research. The consent form should do both 
these things but on balance insure the rights of participants more than 
those of investigators should choice among them be necessary. Neverthe- 
less, a procedure which more nearly balances both participant and investi- 
gator rights and their protection should be optimal. The investigator's 
testimonial or warrant of informed consent should be preferred to the signed 
written informed consent on the following grounds. 

In the investigator's written testimonial or warrant of informed 
consent, the investigator or his/her agent warrants that a person or corporate 
body or their representatives have been advised of the required elements in 
securing their informed consent and that it was granted. The basic elements are 
these: 

1) A written document that is approved by the Institutional Review 
Board embodying all of the basic elements of informed consent; 

2) A statement of whether the document was read by the named partici- 
pant and/or his/her named legally authorized representative or 
read to either party by the named investigator or a named auditor 
witness; 

3) The name and address of the consenting party or legally authorized 
representative who consented or refused consent; 

4) A statement of any additional conditions agreed upon and (at 
option) any modifications agreed to during the elicitation of 
informed consent; 

5) A statement of whether any record is made providing unique 
identification as a matter of record as in visual or voice re- 
cording; 

6) A statement of how the information is to be analyzed and 
disseminated and the information stored (perhaps optional if 
there is aggregative reporting and full protection for confiden- 
tiality of information — though the Institutional Review Board 
must grant such exemption) ; 

7) If the principal investigator is not the person soliciting the 
consent, he/she shall be named, as well as the institu- 
tional sponsor who also signs as the authorized representative 
(the authorized representative should have reasonable proof of 
his/her authorization and provide this identification in securing 
informed consent, but in any case shall be obligated to furnish 
reasonable proof in the course of eliciting informed consent if 
it is requested by any party to consent) ; 

8) A statement of whether any legal protection is afforded the 
participant by this study such as a confidentiality certificate 
or by sanctions against misuse of information. Where no such 
protection is afforded the participant and it is concluded that 




"more than ordinary" risk of harm is involved should confidence 
become public information, the participant must be advised that 
the investigator is not able to afford protection against compul- 
sory disclosure; since tort remedies are possible for misuse of 
information, it is perhaps not necessary that the participant 
be advised of them; 
9) Whether consent was granted or refused and whether there was with- 
drawal following consent (the participant permitting) should be 
recorded; 

10) A copy signed and dated by the principal investigator (either as 
the party eliciting information or performing procedures or as 
granting a particular agent authority to do so) and by the 
representative authorized to secure consent or perform other 
procedures and by the participant or his/her legally authorized 
representative when consent is granted shall be given to all whose 
consent is elicited including those who refuse to grant consent as 
well as those who grant it. The form shall include the name and 
address of the institutional sponsor and of an authorized represen- 
tative for the institution; 

11) The investigator shall keep no record with unique or other identifiers 
when a subject refuses to grant consent or later withdraws it unless 
there is full legal protection against compulsory disclosure and 
misuse of information. Where informed consent is granted the 
investigator shall not refuse to let subjects provide their signature 
of consent unless, in securing informed consent, the participant has 
been advised of specific risks of harm in doing so. When an 
investigator retains a signed written informed 
consent document there must be reasonable protection against dis- 
closure of that evidence; 

12) Where any procedure in the research intervention other than the 
elicitation of information is performed on the subject by other 
than the person eliciting information, there shall be a stipulation 
that the principal investigator agrees to provide full information 
on who performed such procedures at any time that the participant 
or his legally authorized representative requests. Moreover, the 
participant is entitled to information on the name and address 

of any person who had authorized access to confidential informa- 
tion and of unauthorized access, if known to the investigators. 
The investigator therefore is obligated to keep a record of all such 
interventions and who performed them insofar as they are specifi- 
cally for purposes of the research only (some investigators are 
legally compelled to keep such records in any case) ; 

13) No member of an Institutional Review Board shall have access to 
unique identifiers or uniquely identified information unless they 
enter an explicit agreement with the principal investigator to 
protect the confidentiality of such information and where ap- 
plicable become subject to any provisions for sanctions against 
misuse or disclosure (28 CFR 22.29, for example); 

14) When specifically requested, the principal investigator must 
provide information on the specific government agency sponsoring 
the research, its address, and the designated officer signing on 
its behalf. 
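The fourteen elements enumerated above amount to a specification of a record that the warrant form must carry. The following Python sketch gathers the principal fields into one structure; the field names, defaults, and types are illustrative assumptions made for exposition and are not drawn from the report or from any regulation.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical record structure for the investigator's testimonial or
# warrant of informed consent, collecting the elements enumerated above.
@dataclass
class ConsentWarrant:
    document_approved_by_irb: bool    # element 1: IRB-approved document
    read_by: str                      # element 2: who read the document
    read_to_by: Optional[str]         # investigator/auditor witness, if read aloud
    consenting_party: str             # element 3: name and address
    additional_conditions: str = ""   # element 4: conditions agreed upon
    unique_record_made: bool = False  # element 5: e.g., visual/voice recording
    analysis_and_storage: str = ""    # element 6: analysis, dissemination, storage
    principal_investigator: str = ""  # element 7: named PI and sponsor
    institutional_sponsor: str = ""
    legal_protection: str = "none"    # element 8: e.g., confidentiality certificate
    outcome: str = "granted"          # element 9: "granted", "refused", "withdrawn"

w = ConsentWarrant(document_approved_by_irb=True, read_by="participant",
                   read_to_by=None,
                   consenting_party="A. Respondent, 12 Elm St.",
                   principal_investigator="Dr. P. Investigator",
                   institutional_sponsor="University X")
assert w.outcome == "granted"
```

Elements 10 through 14 govern the handling of copies, retention, access, and disclosure rather than the content of the form itself, and so are omitted from the sketch.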

The Investigator's Written Testimonial or Warrant With Provision for 
Complaint. Although one may regard the testimonial or warrant form as 




providing sufficient information and a legal document for formal litiga- 
tion, provision may, and perhaps should, be made to provide for less formal 
modes of adjudication of complaint. In any case, whenever the procedure 
involves actual or potential risk of harm, participants should be specifi- 
cally advised of that "right" or "possibility" including the following: 

1) A statement that if they wish to lodge a complaint or secure 
further information on the particular inquiry (e.g., pending 
further participation) they are given the following information: 

2) The identity of the investigator and/or the legally authorized 
representative who secured consent and of the institutional sponsor 
and its representative as the agents who should be contacted un- 
less the Institutional Review Board chooses to designate it- 
self or other agent to secure such complaints. Complainant rights 
are more fully protected when the complaint is lodged with a dis- 
interested party. Principal investigators and their agents are 
not disinterested parties. Therefore, unless it is unduly bur- 
densome, an Institutional Review Board or its agents should 
receive such complaints and provide for some means of their review 
and adjudication (including, of course, involving principal in- 
vestigators) . The institutional sponsor is not always a dis- 
interested party and for some special kinds of research, the 
government sponsor of the research may choose to require that it 

be designated the agent to whom such complaints should be directed. 
In any case, all investigators or their agents must provide such 
information on specific request as required in the testimonial form. 
We would note parenthetically that in all cases there should be 
some means provided for dealing with complaints whether of principal 




investigators, institutional sponsors or government sponsors of 
research. 
3) There shall be a record made and retained for a reasonable period 
of time following the conclusion of the inquiry of all such com- 
plaints received and any actions taken thereon. Such records, 
when kept by principal investigators, shall be accessible to the 
Institutional Review Board and at least some record of them and 
actions taken on them kept in its files. 
If one intends to protect a subject's right to protection in research 
inquiry, it seems essential that the complaint procedure be followed whether 
or not consent is documented by the written and participant-signed informed 
consent or the testimonial or warrant signed consent form. Moreover, written 
notice of how complaint can be lodged should be the minimum provided when- 
ever informed consent must be secured. The reason for this seems obvious 
enough. Most participants may either err in their acquisition of information 
about whom they are dealing with in the consent procedure or fail to retain 
or recall the requisite information essential to lodging a complaint. The 
obligation in all cases should fall on investigators to provide information 
documenting how complaint can be lodged in the form: "if for any reason 
you wish to know more about this study or complain to others about what was 

done to you, call or write to ." The form of notice should 

in all cases be intelligible to all literate persons. 

Should only the complaint form be required, provision should also be 
made within the written form to advise the participant that information on 
the name and address of any person who had contact with the participant in 
performing any procedure connected with the research intervention or had 
authorized access to any confidential information pertaining thereto or 




of unauthorized access, if known, will be provided on request. There 
likewise should be an obligation to provide information identifying the 
government sponsor on specific request. 

Comparison of Modes for Documenting Informed Consent. We turn now 
to compare the relative advantages and disadvantages of the presently 
authorized mode of documenting informed consent and the proposed model. 
Chart I summarizes these comparisons. Little comment seems called for 
since the comparisons should be obvious to the reader. Before turning to 
some summary observations and conclusions derived from the comparisons in 
Chart I, we return to review the prototypical bio-medical Human Subject 
model and the prototypical behavioral science model as they bear upon these 
comparisons. 

In the choice of models for securing informed consent it is well to 
bear in mind that the prototypical bio-medical model and the prototypical 
behavioral science model have both different "harm" probabilities and 
different harm points in the research process. Quite typically the bio- 
medical model involves some risk of harm both from administering the ex- 
perimental or treatment procedure or by delayed effect, and such effects 
are ordinarily monitored throughout the period of research inquiry. Al- 
though similar conditions prevail in some behavioral science inquiry (more 
likely so in research by psychologists than others) and they are more likely 
to arise in the use of some techniques (e.g., social or psychological ex- 
periments or longitudinal studies) , the prototypical behavioral science 
model involves virtually no risk of harm from or during the data collection 
phase and minimal risk from the data processing and analysis phase. The 
behavioral science model, moreover, incorporates corporate as well as 
person actors. Risk to person or corporate actors in a research inquiry 

[Chart I, comparing the modes of documenting informed consent in terms of freedom of choice, regulation, and legal protection, is not legibly reproduced in this copy.] 




C 




a 




•t-4 


O t4 








O M 




O 73 




<4-4 4J 


d d 








T-t O 




4-1 cl) 




•H M 


l-i 3 O 








0) t|-| 









4J CI) 


O -i-l 








—1 




o> o 




d D. 


T-I 4J 








> (U 




o u 




CU rt 


T) O tO 








o *o 




i-l Pu 


0) 


■O cu 


CU CJ 








S-l CO 




o 


M 


•i-l 73 (J 


4-1 CU -i-l 








^ s 




J2 4-1 

o d 
0J 

0) to 

cu d 
1-1 o 
ft, o 


3 

4J 

w 
d 

60 
•H 

to 


Unique 
require 
s ignatu 


Documen 
evidenc 
identif 







25-88 



1 

B) 


















g. 


















O d X> 


















IM •(-• 0) 


















C ICO 


















•i-l r-4 'i-4 


















p. > 


















-a g o 


















cow 


















o) CJ P. 


a) 




13 












CJ QJ 13 


d 


-a 


<u 












CJ l-l o 


o 


CU 


4.J 


13 




13 


Q) 




•rl 0) 60 


IS 


u 


d 


<D 




QJ 


r-l 




4-J XI 13 




•H 


oj 


W 




t-l 


Xl 




O £ O 




d 


g 


•H 




•■-4 


CO 




C! r-l 




cr 


D 







O 




d 




0) 


o 


cr 




cr 


•r-l 




r-i o oj 




S-l 


o 


cu 




0) 


1-4 




f. C -° 
Sj ^ 






•a 


S4 




S-i 


P, 






a 










P: 






d 


4-J 


4J 




4-1 


CO 




O -i-i co 




J2 


,0 


o 




o 


d 




fn w g 




J3 


<j 






25 


M 




■»-. u 


















to 3 


















Q) -r^ £) 


















3 J3 




S4 


1 












tr -a 




o 


p 












•H rQ CJ -a 




<4-4 


CJ S-I 












d d tj a 






o o 












3 !J >rl T3 




13 


13 4J 13 


13 




13 


13 




> 'H 


to 


d 


•H QJ 


OJ 




01 


CJ 




d m a o > 


w 


cd 


CD 13 S-l 


J4 




J-i 


i-l 




o P-i o '<•' o 


:j 




> d -i-i 


•r4 




■r4 


•H 




.c a. u 


I— 1 


k» 


•H nj 3 


d 




D 


3 




d u-i js ex 


d 


Xl 


4-i cr 


D' 




cr 


D H 




o o aj 


3 




ctf o a) 


CJ 




OJ 


CJ 




•H 4J --4 0) 




Tj 


iJ d sj 


(^ 




P-i 


Pi 




•!-> >> d X> CJ 


'd ID 


cd 


d 












« 4J oj cd v-i 


01 i) 


CI 


0) •'■ CO 












£ -H . to CJ 4-1 


S-i 4-J 


Si 


CO 13 CO 












fj JJ (\> T-4 O 


•-4 0* 




CJ II) CD 












O B i-l d 


d a 


G 


M J ' d 












H-l (!) J-i CX 


C o 


o 


p. d 4-1 












fi'O « RO 


CD X 


d 


cu w •■-( 












M -H ,c « d 


« a 


*-* 


(^ B & 












r O 


















a 


















s-i 


















•r-4 






f^ 












U 3 






1-4 












o cr 






d 












<u 






o 












d m 


















o 






CO 






13 


13 




•H 0> 






CO 


13 




CJ 


OJ 




U O 




•o 


0) 


OJ 




!-i 


I-l 




S -H 




o 


c 


f-4 




•r-l 


•H 




4J 4-J 


oj 


Si 


4-J 


•t-l 




3 


p 




d o 


d 


•r-4 


-,-i "0 


d 




V 


cr 




<u d 


o 


3 


£: 0! 


tr 




OJ 


OJ 




g a 


2. 


cr 


4-J 


a> 




Pi 


PS 






CJ 


14 d 


cc 










CJ OJ 




I-l 


o ai 






4) 


4J 




O *-> 






44 i'i 






,q 


O 




13 4-i 




0.1 


•H fj 






f^A 


« 




•f-l 




d 


13 O 












5 M 




o 


d o 












S3 & 




53 


<; 13 
















60 


00 












4J 


I 


C 


d 










d 


d 


rt 


•r4 


•r4 








d 


o 


tfl 


4-J 


4J 


4-1 








o 


T-l 


c 


d 


d 


d 


u 






M-J d -H 


i>> 4J 


•H 


a) 


0) 


QJ 


"4-1 O 




y-i 4J 


O O 4-J 


r-l CO 


co 


g 


E OJ 


§ 


O 4-1 




o d 


•r-4 O 


<D S 


i-i 


3 


d m 






OJ 


CJ JJ UJ 


3 S-i 


a 


O 


o d 


o u 


4-J CO 




4-> {2 


S-i 03 4-J 


cr o 


g 


o 


o -0 


o o 


d 4-i 




d ai 


3 O o 


•r4 lt-4 


o 


•13 i-l 


'O 0J 


•O CO 


oj d 




CJ OJ 


CO T-l S-4 

O "4 Pi 


d d 


o 


cd 


o 


a) co 


E oj 


13 


E u 


d> i-i 




S-I CO 


Sj o 


U > CJ 


oj b 


OJ 


CJ 00 


!-H T4 




U-l 


O 3 


o u 


o --I d 


4J OJ 


e 


4-J CO 


O 4-1 13 


U QJ 


o d 


H-4 14-4 


<4-i a. 


U-4 4J 4-1 


CO r-4 


S-i 


<0 


co d c 


O r-l 


o 


a> 




CO -i-l 


JJ OJ 


o 


4-1 I— 1 


•H OJ CO 


y-i xi 


d -i-i 


d M 


d m 


Cu J 


CO 


t+J 


CO CO 


13 13 


cfl 


O 4-1 


o 


o d 


o d 


13 


d 


c 


M -O 


•a t4 


i-i cd 


•i-4 «-! 


t4 T-4 


•H <y m 


d oj 


■i-i 


d o 


d oj 


QJ <J-4 


en S 


CO o 


CO £ 


CO CO o 


QJ V4 




CJ T-l 


a 1 cj a 


•a tj 


•.-4 H 


•i-l 


•r4 (J 


•rl flj 4J 


4-1 'i-l 


00 


4-J 4-1 


4-1 3 T-4 


■1-4 4-1 


> o 


> d 


> o 


> U "r-4 


4-1 3 


d 


4-J tJ, 


4-i cr to 


> C 


O U-l 


o o 


o "W 


o P-*d 


•H cr 


•H 


■r-l X) 


•r-4 t4 4-1 


O OJ 


f-i d 


t-l -1-1 


£.s 


ij oj d 


U OJ 


OJ 


>-l 13 


S-I d OJ 


S-J 13 


P-l 1-4 


P-I 4J 


Pi ti tf 


IS t-l 


x> 


g: ctj 


SPrt 


P-i 1-4 



25-89 





cu 






d 




•a 






o 




«o 










E 






cu 


J*> 








X) 


I-I 


d 






•r-l 


d 


o 




CU 


> 


o 


•rl 




r-l 


o 




co 




X> 


u 


4-1 


•H 




cd 


p< 


d 


> 




o 




<d 


O 




•rl 


o >-. 


•rl 


£ 




i-l 
Pi 


4-1 I— 1 

d 


O 






P-. 


XI O 


•rl 


u 




d 


cu 


4-1 


•rl 




d 


4-1 4-1 


rl 


lW 




r-l 


cd co 


<3 


•r-l 






60 CU 


CM 


o 
a 






•rl 3 

r-l XT 


>» 


ft 






XI CU 


CO 


co 






O r-l 








d 










o 


d 


x: 






4J T-l 


o 


4-1 






•rl 4-1 




o 






o cu ns 


cu 


JO 


d 




r-4 -rl fi 


■d 

•r-l 


u 


o 




G. > O 


> 


o 


■H 




r! 0*l 


o 


xi 


co 




H M d 


u 


■u d 


■rl 




P. -r-l 


(X 


d td 


> 




O 




<d M 


o 




4J O r-l 


o >^ 


P- 4-1 o 


u 




4-1 Cd 


4-1 r-l 


•H d 4-J 


S 




xi -r-i 


d 


o cd cd 






OJ 4-J 4-1 


TJ O 


■h a, to 


u 




+j d d 


CU 


4J t4 -H 


•r-l 




o a cu 


4J 4J 


V' C.I J> 


IM 




"H g 'O 


cd CO 


Rl •< 1 CO 


•.-1 




rJ CU -rl 


60 OJ 


fm 4J a 


O CI 




4-1 CJ IW 


■rl 3 


VJ > 


a 1 13 




U rl C 


r-l cr 


!^ C3 d 


ex. ,-3 




CU 60 o 


X) cu 


CO Cu r-l 


M g 




pi < o 


o u 








>, 










r-l 










r-4 


T3 








cd 


CU 








cu -a 


to 


•"N 






r-l Cd 


•H >, 


XI 






o o 


!— 1 CO r— 1 


0) 






M 


x d d 


■r-l 
r-4 

P. 


r-l 




4-1 XI 

d co 


o o o 

•r-l 

CU 4-1 CU 





r-l 




td 


O CO J-l 


*!-) 


Cd 




P3 XI CO 


d CU 3 


^^ 


O 




Pi rJ 


"H ."3 "d 




•rl 




H U J 


(0 C7* CU 


S*> 


4-4 




3 SB 


o 


t-l 


•rl 




4-1 Xl O 


CU !j O 


d 


CJ TI 




O & 


13 CU r-l 


c 


CU CD 




x> 


cs ts a. 




p-X) 




co cu & 


E co 


H 


CO >rl 




4-1 d a 


a 4J 


Pi 


> 




JZ "rl -r-l 


CU cd 3 




4-> O 




60 4-1 > 


d o 


>-. 


o u 




•r-l CU CU 


o o xi 


CO 


Z Ph 




Pi X> U 


2 4J B) 




1 




TJ 






g 




" CU & 

cd "H cu 


i 

>> !-l 




fit) c 




g IH "H 
H "rl > 


CM O 




O 111 * 




•rl CO 




"S g£ 




O 4-1 CU 


4J B 






•h d a; 


d o 




H o 


4-1 


d cu 


CU P-. 




4-1 H 


d 


CD H ID i— 1 


•X3 CO 




ci-i M a) 


CU 


3 H fl 


r-l 




o cu xl 


CO 


er i-i d 


4J 


u-i a 


P-i 4J 


d 


•ri o !*> a 


i-i d 


o o 


>> o 


o 


d r-l -H 


o cu 


■H 


4-J O 


o 


3 CO CU 4-1 


CM g 


d 4-> 

O «! 


•ri x; CO 

rH 3 CU 


60 


^ 3 3 

o cu cr 4-> 


d 
d h 


•H 4J 


•rl 1-1 


d 


4-J -r-l -r-l -rl 


o cu 


to d 


xi d a 


•r-l 


4-4 d 4-1 


•M > 


to a) 


cd O XI 


4-1 


CO -r-l J3 CO 


CO o 


cu g 


r-l CU 


•H 


CO 4-1 d -o 


•rl O 


CO 3 


•M d O 


O 


cu d d H u 


> 


(0 o 


WOO 


•r-l 


o cu o to 


O 60 


n ° 5 


> -H t-r 


>~" 


O T) -r-l ^ O 


i-i d 


P< Q 1 


< 4-1 P-I 


w 1 


<; h j xi « 


CM -M 











4-1 


cu 










o 


J3 




1 




CO 


cu 


4-J 




•rl 




cd 


•r-l 




a 




:* 


XI 


M >% 




•rl 


° 


cu 


o 


CU r-i 




4-1 


cd 


4-1 4-1 




x: -h 




u 


T-l 


d cd 


u 


4-1 H 




<d 


u 


CU l-l 


cu 


o cd 




Pu 


cu 


CO o 


X! 


d 






4-1 


a cu 


4-1 


TJ "rl 




Xi 


•rl 


O J-l 


o 


d xi 




4-> 


rl 


o o 




cd ij 




•rl 


O 


o 


cu 


o 




& 




T3 


E 


r*. 






d 


CU cu 


o 


4-1 CO 




d 


O 


e e 


co 


CU 'rl 




cu 


CO 


!-i O 




•rl 




4-1 


•rl 


O CO 


d 


r< d 




4-1 


U 


CM 


cd 


ri cd 




•rl 


cd 


d i-i 


Xi 


cd x; 




rl 


&. 


•rl o 


4-1 


4-1 




& 


E 


CM 




4-1 






o 


4-1 


U 


cd >. 




CU 


o 


cd cu 


CU 


xi u 




x: 




X! 4-1 


a 


4.J CU 




4-1 


r-l 


4J cd 


cd 


> 






1-1 


•rl 


pu 


CU o 




O 


cd 


CU !-l 




r-l O 




4J 




S-i &. 


S 


XI CU 






4-1 


3 O 


o 


•n l-i 




r-l 


o 


■M U 


M 


CO 




cd 


c 


cd P-. • 


CM 


CO u 




r-t 




d rx x> 




O "rl 




60 


4-1 


60 Cd cu 


>> 


C4 CU • 




<b 


3 


•rl M 


u 


x; co 




4-1 


X) 


CO CO 'rl 


cU 


CO 4J 4-J 




d 




•H 3 


> 


•rl d 




•r-l 


CU 


r^ CT 


o 


o CU 






E 


XI >. CU 


u 


u u to 




I-I 


O 


r-l U 


cu 


H cd 




r-l 


CO 


4J XI 


u 


cu 




cd 




d cd co 




r-l H 






rl 


CU XI -H 


CM 


• X) -rl 




4-1 


o 


3 u d 


o 


rS Cd CU 




O 


CM 




VJ W x: 




d 




O P. o 


4J 


O O 4J 






T3 


O -rl 


r-l 


4J > 




cu 


CU 


X) E 4-1 


3 


Ctf CS J-l 




u 


•rl 


14 cd 


O 


d 4-i o 




cd 


CH 


1-1 O 4-1 


•r-l 


to 






•H 


o >4-i d 


CM 


■rl 11 CO 




>, T3 


4-1 CU 


CM 


CO )-l (J 




CU 


o 


IS rl £ 


•r-l 


O O 




X! 


g 


60 (d 3 


X) 


"O £ U 




H 




•H d o 




d cd 






0) 


4J o O 


CU 


cd cu to 






x> 


CO -H XI 


w 


M 'rl 




• 




CU 4-1 


Q 


!-i cd 4J 




pi 


c 


> P~ d 


E 


ai co 




Pk 


rt 


a o cu 




P.. 4J CU 




u 





•rl 4-1 


CU 


cd d > 








CO 4-1 


rJ 


P. cu C 




44 


e 


CU "rl -r-l 


cd 


CO *rl 




d 


-d .C V4 




cu d 




cu 


o 


■u H & 


CD 


x: o d 




CO 


CM 




4-1 


4J o Cd 




CU 




4-> d 


d 


x: 




u 


CU 


id • cu 


•rl 


XJ XI 4-1 




P.X1 


x: -w x: 


S-l 


4-J CU 






H 


■u d & 


P. 


O d CO 




cu 




cd 


rJ 


XJ 60 4-J 




x: 




r>^ p-x; 


CU 


•m d 




4-1 


• 


r-l .rl CJ 


C) 


M to cd 






& 


d o m 


d 


o a. 




60 


u 


O "rl Cd 


•i-i 


44 60 -rl 




r; 


o 


4-j a) 


CM 


d cj 




■r-l 


IW 


cu W CO 




tO -r-l -rl 




CO 




rl 13 CI 


4-J 


d d 4-j 




3 


d 


•rl PC 1-1 


rt 


O -rl U 






o 


3 


x: 


•rl cd cd 




CU 


•H 


CfCl >, 


4-1 


4-1 4-1 P. 




•o 


4-J 


cu td cu 




■rl Xl 




a) 


cd 


pH > 


4J 


XI O r4 




B 


4-1 


3 M 


u 


d o 






a 


4-1 o 3 


cd 


o d cm 




cu 


cu 


Xi -rl CO 


CM 


8. >> 




r-l 


p 


60 4-1 






cd 


3 


■rJ M X! 


CU 


C 3 r-l 






o 


E cd o 


X! 


o cu 




CO 





P- 3 


4-1 


4-1 r* 




d 


XI 


E E 




CO d -H 




.O 




M cd 


'4-1 


XI Cd r-H 




CO 


cu 


O c-i 


o 


d xi 




•r4 


1-1 


4-i E o 




cu d cu 




rJ 


D 


O CM 


CU 


P- CU M 




cd 


4J 


r-l M 


4-1 


CU 4-1 o 




a. 


cd 


Cd C|-| rrj 


o 


X) 4-1 g 




E 


d 


d d 


d 


cd 




O 


60 


o x) cd 




xi xi 




o 


M 


'r-i CU 


cu 


o co d 






CO 


4-14-10) 


,M 


3 M co 


CO 


cu 




P-'rl 14 


cd 


E o 


cu 


co 


4-1 


o o o 


E 


4-J CU 


4-1 


cu 


d 


•M 4-1 




4-1 CJ CO 


o 


x: 


cd 


d i-i o 


cu 


3 cd cd 


d 


H 


ex 


<; cu cd 


3 


Xl 4-1 o 


4-1 












o 












o 


• 




• 


• 




fn 


r-l 




CM 


o 





25-90 



ordinarily attends the disclosure of confidential information, particularly 
the risk from compulsory disclosure. This form of harm, should it arise, 
usually follows publication or the dissemination of information when all 
contact with the subject is terminated. Indeed, given the high residential 
mobility of participants in much behavioral science inquiry, participants
are not easily followed or located.

The comparisons provided in the preceding discussion and in Chart I 
call attention to forms of documentation and protection that would seemingly 
be essential to documenting informed consent yet are not provided in current 
federal regulations (45 CFR 46). Although some of these could be added to
the present form of "written informed consent," others, when added, would
transform it into either a testimonial or warrant document or into a formal
agreement signed by both parties. Moreover, it should be clear that there are many
different combinations possible of the elements in Chart I and the elements
of information necessary to being informed. The specific concept to be
applied to the form of documentation is of the barest consequence.

Comparison also will make apparent that the currently approved regula- 
tions for documentation are balanced in favor of providing protection for
investigators rather than for participants. They also leave both participants
and investigators vulnerable and unprotected in matters of confidentiality
of information. The investigator testimonial form moves toward
fuller guarantees for both participants and investigators, though on 
balance it may protect more the rights of participants. Without more
adequate legal protection against misuse and explicit provision
against compulsory disclosure, however, all forms of documented consent make
both participants and investigators vulnerable to harm through the disclosure 
of information. It is to this matter — the legal protection of confidentiality — that we shall turn in Sections III and IV.

Feedback on Procedure and Participant Satisfaction.

Some bio-medical and behavioral science inquiries make provision for 
feedback from participants on the procedures used and their satisfaction 
or dissatisfaction with being a participant either during the procedures 
or at their termination. The question should be asked whether feedback 
should ordinarily be required as part of any research procedure. There 
are persuasive arguments in its favor but some to the contrary as well. 

Feedback can be of considerable utility to both participants and 
investigators. Investigators may well learn how to reduce risk from harm 
or how to increase participants' benefits by systematically eliciting feed- 
back. The more exploratory the procedure or the greater the risk of harm 
from its use, the more Institutional Review Boards should consider making 
feedback an essential element of "procedure." Such feedback should be
required whenever there is reason to believe that knowledge of it will make
corrective action possible to protect the participant from further harm or
to "undo" harm.

Yet there are reasons why feedback should not be required, and even
reasons to prohibit its use.
participants from legitimate complaint or they may use it in other deceptive 
ways. Even though investigators do not intend these effects, whenever 
feedback has a reasonable likelihood of doing so for a reasonable number of 
persons "at risk," its use should be prohibited or circumscribed so as to 
avoid effects that are not in the interest of the participant. 






Participant Rights to Information

There are important and largely unexplored issues about the rights of 
participants to confidential or other information that has been secured from 
them by informed consent for purposes of scientific inquiry. Apart from such 
rights as Federal law provides — e.g., were information given to a govern- 
ment sponsor of research, a participant has the right to review all informa- 
tion that retains unique identification and to correct the record — the 
issues are far from clearly formulated, much less resolved. There are many
difficult questions that will arise in discussing the matter of participant 
rights to information and we shall not review them here. For example, given 
the right to review information that is uniquely identified and to correct 
that record, is it reasonable to conclude that subjects can correct many 
matters of observation and recording that refer to their behavior, attitudes, 
or other research investigator recorded information, or does it apply only 
to those items of information that can be validated independently of the 
subject's correction? 

Matters of correcting research records apart, participants should have
a right to request a copy of all information where unique identification is
retained, so long as its disclosure does not invade the privacy or other rights
of any others referred to in that record. Both the proposed procedures for
protecting the identity of human subjects of DHEW (42 CFR 2a.7(b)) and of
LEAA (28 CFR 22.23(4)) provide for the release of confidential information
with the consent of the participant, but they do not unequivocally grant
the participant the right to review or request a record of all information 
that is retained with unique identification. There perhaps are some limits 
on the extent to which participants may request information that remains 
uniquely identifiable as, for instance, were there a strong presumption 




that knowing it would cause the person considerable harm. A more difficult 
question, however, is whether the participant is entitled only to that 
information given with the participant's informed consent or to all informa- 
tion on the participant in the records in a uniquely identifiable form. Often 
in behavioral science studies, information is secured on transactions among 
persons or corporate actors. Where the informed consent of others was 
involved in securing the information, a participant's rights are less 
clear. On the one hand, any confidential information, regardless of its 
original source, is potentially damaging on disclosure and any participant 
should have a right to know what is in the record, but on the other hand, 
persons who gave such information with a promise of confidentiality have 
a right to have the information kept confidential. The teacher who provides 
information on pupils or the wife who provides information on the husband 
(and vice versa in the above illustration) create a special case where 
rights to information are hopelessly intertwined and where the knowledge 
that each would have access to all information provided by the others 
obviates all forms of research except that of the public forum. In short,
without limits on the right of participants to information that is uniquely
identified, a right to request all of the information that is a matter of 
record could well have a chilling effect on all research where confidential 
information is secured about a participant from anyone other than the 
participant. 

The Rights of Investigators to Information Secured on Promise of Confidentiality.

Whether and what rights investigators have to information that has been 
secured by informed consent without explicit forms of legal protection is far 
from clear. They obviously possess those rights in the information that are 




matters of informed consent and contract — to use it for the explicitly stated 
purposes of research and all related interventions explicitly provided for 
as matters of agreement. But that right is not exclusive, subject as already 
noted to rights of those who provide the information. Where a promise of
confidentiality is explicitly provided for in obtaining informed consent,
investigators should have the exclusive right and duty to protect the
information, subject only to the rights of those who consented to give the
information. They, of course, have a right to use that information only so long
as they intend no harm in using it.

Transfer of Confidential Information to Other Investigators. Among the
many other matters at issue in subject and investigator rights is the ques- 
tion of whether an investigator may share information that has unique 
identifiers with other investigators to whom consent was not originally 
given. This matter arises where confidential records have been obtained 
and where there was no prior agreement to use them for a given inquiry. 
Indeed, any more or less general provision giving an investigator the right 
to permit access to the confidential information for purposes of research, 
other than that for which informed consent is being secured, will generally 
be so vague and incomplete as to lack the very elements considered basic 
to informed consent (Goldstein, 1969). Should this mean then that except 
where informed consent is originally secured for one or more specific 
projects, or where a subject is subsequently contacted and informed consent 
secured for each subsequent project using the information, no other research 
access should be permitted? That rule would seem to be unusually burdensome, 
given the high cost of much behavioral science inquiry and the cost attendant 
upon building up time series from individual data. 

Where the risk from subsequent use of the information provided by informed
consent can arise solely from its public disclosure, and where there is
legal protection against compulsory disclosure and strong sanctions against
its misuse, investigators should have rights to transfer confidential
information, provided the same protection is afforded on transfer. That
right should extend to its use not only for similar and related projects 
but to unrelated ones that in the judgment of the investigator and/or
institutional or government sponsors are in the public interest of free 
scientific inquiry. The transfer of such information should not be left 
entirely to custom, however, and should be protected through federal regula- 
tion. The proposed regulations by LEAA set forth the major elements for any 
information transfer agreement (28 CFR 22.23 & 22.26). 

The major elements of a request for transfer of information should
include the following (28 CFR 22.26(b)):

. . . the general objectives of the project for which informa-
tion is specifically requested, and specifically justify the
need for such information in identifiable form. The request
shall also indicate and provide justification for the conclu- 
sion: 

(1) That conduct of the project will not, either directly 
or indirectly, cause legal, economic, physical, or social harm 
to individuals whose identification is revealed in the transfer 
of information. 

(2) That conduct of the project as designed would not be 
expected to have a detrimental effect on overall future research 
or statistical efforts of the Federal or State government. 

The information transfer agreement should be formally executed and 

should make provision for at least the following minimum information 

stipulated in the proposed LEAA regulations (28 CFR 22.24): 

(a) Information identifiable to a private individual will be 
used only for the purposes stated in the transfer agreement. 

(b) Information identifiable to a private individual will not 
be revealed to any person for any purpose except where 

(1) The information has been included in research findings 
(and/or data bases) and is revealed on a need-to-know basis 






for research or statistical purposes, provided that such 
transfer is approved by the person providing information 
under the agreement, or 

(2) is authorized under 22.24(e). 

(c) Knowingly or willfully using or disseminating information 
contrary to the provisions of the agreement, shall constitute a 
violation of these regulations punishable in accordance with the 
Act. 

(d) Adequate administrative and physical precautions will be 
taken to assure security of information obtained for such pur- 
pose. 

(e) Access to information will be limited to those employees 
or subcontractors having a need therefore in connection with 
performance of the activity for which obtained, and that such 
persons shall be advised of, and agree to comply with these 
regulations. 

(f) Project plans will be designed to preserve anonymity of 
private persons to whom information relates, including, where 
appropriate, required name-stripping and/or coding of data or
other similar procedures. 

(g) Project findings and reports prepared for dissemination, 
will not contain information which can reasonably be expected to 
be identifiable to a private individual. 

(h) Information identifiable to a private individual (obtained 
in accordance with this agreement) will, unless otherwise agreed
upon, be returned upon completion of the project for which ob- 
tained. 
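The name-stripping and coding called for in item (f) can be sketched in modern terms. The sketch below is a hypothetical illustration only, not part of the LEAA regulations: each direct identifier is replaced with a random code, and the code-to-identifier linkage is kept in a separate table that can be stored, protected, or destroyed independently of the research data. The function and field names are the author's own for illustration.

```python
import secrets

def strip_names(records, id_field="name"):
    """Replace the direct identifier in each record with a random code.

    Returns the coded records and a separate linkage table
    (code -> original identifier). The linkage table would be kept
    apart from the research data, so that the records alone cannot
    reasonably be traced to a private individual.
    """
    linkage = {}
    coded = []
    for record in records:
        code = secrets.token_hex(4)  # random pseudonym, 8 hex characters
        linkage[code] = record[id_field]
        # Copy every field except the direct identifier.
        stripped = {k: v for k, v in record.items() if k != id_field}
        stripped["code"] = code
        coded.append(stripped)
    return coded, linkage

# Hypothetical usage: one survey record with a name and a data item.
coded, linkage = strip_names([{"name": "A. Smith", "income": 9500}])
```

After this step the coded records carry only the pseudonym, while re-identification requires the separately held linkage table, which is the property item (f) contemplates.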

These proposed regulations clearly provide that all sanctions, including
fines, obtain for all investigators and their employees who have access to
information by transfer agreement. Whether the provisions of protection
from compulsory disclosure also apply is not altogether clear in the LEAA
regulations, but were a transfer agreement to be legally authorized under
the proposed HEW regulations for protecting the identity of subjects (42
CFR 2a), such protection would seemingly apply to the transfer agreement
as well since: "The protection afforded by a Confidentiality Certificate
is permanent with respect to subjects who participated in research during
any time the authorization was in effect" (42 CFR 2a.8(c)).

Cooperative Activities to Develop Confidential Information . The DHEW 
regulations currently in effect make provision for cooperative activities 
"... which involve institutions in addition to the grantee or prime 
contractor (such as a contractor under a grantee or a subcontractor under 
a prime contractor)." (45 CFR 46.16). They further provide that: "If 
in such instances, the grantee or prime contractor obtains access to all 
or some of the subjects involved through one or more cooperating institu- 
tions, the basic DHEW policy applies and the grantee or prime contractor 
remains responsible for safeguarding the rights and welfare of the sub- 
jects" (45 CFR 46.16). 

The obligation which falls on the grantee or prime contractor to 
safeguard the rights and welfare of subjects will hardly guarantee subjects 
protection when, as is now the case, there are no stringent sanctions against 
unauthorized disclosure or misuse of the information. Prime contractors 
now have no specific sanctions available to deter misuse, nor are there
provisions against compulsory disclosure. One must grant that it is 
surprising that despite considerable subcontracting in behavioral science 
research, few situations have arisen where sanctions are appropriate. Yet 
there have been somewhat more situations where protection against compulsory 
disclosure seemed essential, as in the negative income tax experiments in 
New Jersey. 






III. CONFIDENTIALITY

Confidentiality is the communication in confidence of private matters. 
It involves a fiduciary responsibility — a pledge or promise to hold privi-
leged or otherwise keep secret communications about private matters. It
also involves a promise of protection and implies the capacity to keep mat- 
ters communicated in confidence from disclosure by the confidant or others. 

Both fiduciary and protection obligations come into question when in- 
vestigators pledge confidentiality in behavioral science research. Investi- 
gators often are caught in a dilemma: a pledge of confidence is necessary
to secure valid and reliable information, but they lack the legal right of
privileged communication and the sanctions of tort law are inadequate pro- 
tection against misuse or disclosure by others to whom the responsibility 
for confidentiality must necessarily be entrusted. Viewed another way, many 
investigators are well aware of the fact that organized behavioral science 
inquiry involves a chain of confidence and they are personally and morally 
committed to maintaining that confidentiality even at the risk of loss to 
themselves. They lack, however, sufficient knowledge about their vulner- 
ability to disclosure and the weakness of the protection afforded them in 
promising confidentiality. Many behavioral science investigators, therefore, 
enter into a pledge of confidence in good faith but their faith rests on a 
weak societal foundation whose dimensions they know not. 

Requirements for Pledge of Confidentiality. On its face, a pledge of
confidentiality should not be extended by investigators for information
provided them except when it concerns private matters. What is a private matter
is defined at law; yet it hardly bears noting that most people are unaware 
of what are private, privileged and public matters at law; ultimately the 






courts will decide if a particular matter at issue is private, privileged 
or public. And that is of little help either to investigators or their
participants whose informed consent is being elicited. There is a reason- 
ably strong basis for argument that communication between investigators 
and subjects should be privileged whether or not there was an express or 
implied promise by the investigator to the participant that the information 
provided will be treated as confidential. First, most subjects become ac- 
cessible to inquiry because investigators approach them to seek their coop- 
eration; while confidentiality may be promised, when it is not, unless they 
are expressly advised that what they say can be told to anyone, they invari- 
ably assume confidentiality. Moreover, even when they volunteer to partici- 
pate, they ordinarily assume that implies confidentiality as well. Most 
participants, moreover, are unaware of what is implied in confidentiality 
even when they are advised of some risks. Likewise, the method used to ac- 
quire information may not make it easy to promise confidentiality since 
others may be privy to the same matters. That should not exclude the parti- 
cular instance from protection, however. I note parenthetically that a par- 
ticipant may disclose the same information to a friend, a journalist, and a 
behavioral scientist with the expectation that all will regard it as confi- 
dential and without any explicit promise of confidentiality, a complicating 
feature in much behavioral science inquiry. Unlike much biomedical inquiry, 
where a particular piece of information is unlikely to become available ex- 
cept as a consequence of the research intervention, behavioral scientists 
seek to acquire information to which many others are privy but in the ex- 
pectation that it will be regarded as confidential. Needless to say, full 
protection of information on participants by investigators is possible when, 






and only when, the investigator and the participant uniquely share a given 
matter as "private". That is, we suspect, rarely the case, so that partici- 
pants are not fully protected against disclosure of information they share 
with investigators but protected only from disclosure by that source. From 
the perspective of private persons (both individual and corporate actors) in 
our society, whatever they regard as private matters in giving confidence 
should be treated as a confidential matter in research regardless of the 
status of law, i.e., there should be an absolute privilege. Yet it is un- 
likely that the society will grant an absolute privilege since that would 
cover a confidence that one was about to commit a heinous crime such as an 
assassination of a public official. While at law such relationships may 
need to be conditionally privileged, the conditions should be relatively few 
if one is to recognize that most people "respond to a guarantee of confi- 
dentiality with their definition of what is confidential" and we should or- 
dinarily be prepared to protect those matters as well since they are im- 
portant to them. 

Yet, from the perspective of the public interest, there may be other 
reasons why investigators should not be given a simple license to promise 
confidentiality in exchange for information or compliance with an interven- 
tion in research. The government and institutional sponsors, as well as 
investigators, may wish to categorically prohibit or circumscribe the in- 
vestigator's right to pledge confidentiality for public behavior. Seemingly 
there is no reason why they should not. Yet often access to public behavior 
must be obtained or the behavior of public officials or employees is being 
investigated and these become possible only through a promise of confiden- 
tiality. 






Conditionally, it might be argued that investigators have no right to 
promise confidentiality for information on illegal conduct. Were that so, 
much behavioral science inquiry that leads to an understanding of illegality 
and its regulation by law and custom would not be possible. Indeed, it 
might be argued that conditions on confidence or privileged communication 
in research should generally restrict neither substance nor procedure in 
inquiry unless it is so clearly and substantially damaging to the public 
interest as to be stipulated by specific exclusion. 

Given full protection of confidentiality in research, questions of its 
use might well be left unregulated by other than the investigator were it 
not for two matters. First, the capacity to protect confidentiality is 
never absolute and therefore damage may result. For that reason some as- 
sessment of potential harm and its seriousness and the capacity to protect 
confidentiality must come into consideration in permitting investigators 
to promise it. Second, the privilege of communication opens the door to 
abuse of privilege. Investigators may seek to acquire more information 
than is necessary to the particular inquiry to meet their own needs or re- 
quirements rather than those of the participants or the larger public in- 
terest in research. Some regulation is necessary to protect both private 
persons and the public interest against unnecessary intrusion into private 
matters. In practice, it will be difficult to regulate investigators ex- 
cept by prohibition of a pledge of confidence given the difficulty of de- 
ciding what information is essential to the inquiry on the one hand and the 
fact that extraneous confidential information is offered by participants 
on the other. To refuse to protect participants who offer what is regarded 
as extraneous information seems arbitrary and not in their interest, given 






what for many must be a very limited understanding of what lies within and 
what lies without the approved domain. 

The major means for regulating the promise of confidentiality is to 
require that it not be utilized when there is some alternate means that 
will permit the acquisition of information while protecting the anonymity 
of participants from everyone, including the investigators. Absent that 
possibility, whenever possible, restriction should be imposed on the number 
of persons that may be privy to private matters and the length of time 
that uniquely identified information is accessible for disclosure. Yet even 
these matters are vulnerable as we have already noted, e.g., to testimony 
when physical evidence is destroyed but testimonial evidence remains viable. 
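One family of alternate means of this kind, known in the survey literature as randomized response, lets investigators estimate aggregate rates of sensitive behavior while no individual answer is interpretable. The sketch below is a hedged illustration, not a procedure from this paper: it shows a forced-response variant, and the probability parameter and data are hypothetical.

```python
# Hedged illustration: a forced-response variant of randomized response.
# No single "yes" is identifying, yet the aggregate rate is recoverable.
import random

P_TRUTH = 0.75  # assumed design parameter: chance of answering truthfully

def randomized_answer(true_value):
    """With probability P_TRUTH answer truthfully; otherwise answer
    'yes' regardless, so any individual 'yes' is deniable."""
    if random.random() >= P_TRUTH:
        return True            # forced "yes"
    return true_value          # truthful answer

def estimate_rate(answers):
    """Invert the known noise: observed = P_TRUTH*rate + (1 - P_TRUTH)."""
    observed = sum(answers) / len(answers)
    return (observed - (1 - P_TRUTH)) / P_TRUTH

random.seed(0)  # fixed seed so the sketch is reproducible
true_values = [True] * 200 + [False] * 800   # hypothetical 20% true rate
answers = [randomized_answer(v) for v in true_values]
rate = estimate_rate(answers)   # recovers roughly 0.20 from deniable answers
```

Because the investigator never learns any individual's true answer, there is nothing confidential to compel; the anonymity is built into the eliciting process itself.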

In brief, then, a pledge of confidentiality seems essential whenever 
(1) participants regard it as a condition of providing essential information, 
whether or not at law it is a private matter; (2) whenever the information 
cannot be elicited and usefully analyzed with the participants remaining 
anonymous; and (3) whenever the pledge is not clearly and substantially damaging 
to the collective or public interest. Correlatively, no promise of confiden- 
tiality should be given or required when there is no possibility for personal 
harm on disclosure of information or when the objectives of the study can 
be accomplished by strictly anonymous procedures. Whenever one restricts 
the chain of information, however, a promise of confidentiality must be 
given if there is risk of harm, though in the interest of protection from 
disclosure one may want to impose restrictions on acquisition of uniquely 
identifiable information by members of the chain. 

Risks in Protection of Confidentiality . We have noted on numerous oc- 
casions that the major harm from behavioral science inquiry follows from 
the fact that information is socially powerful and damaging. When information 




is used on private matters it may damage the interests or welfare of those to 
whom the information pertains. Now since the major risk from disclosure comes 
either from unauthorized or illegal misuse or from compulsory disclosure, 
harm will rarely occur if one is protected from these sources of disclosure. 
We turn to guaranteed forms of protection from these forms of disclosure in 
Section IV. 

There are, nevertheless, some other problems related to protection from 
disclosure. Some of these arise from the nature of analysis and publication 
of results. For the most part, the analysis of data and publication of re- 
sults in behavioral science inquiry are concerned with aggregate information. 
There is thus little risk from disclosure apart from the eliciting and early 
processing of information unless for some reason the data are to be retained 
with unique identifiers for subsequent analysis. There are conditions under 
which one may do so apart from an interest in longitudinal or panel studies 
where uniquely identified individual or corporate actors are followed for 
extended periods of time. Some behavioral scientists have an interest in 
deviant case analysis. Correlation is always far from perfect and explana- 
tion is often incomplete and unsatisfactory. Returning to uniquely iden- 
tifiable information with the opportunity either to add information to it 
or to undertake a different form of analysis is extremely useful in trying to 
understand one's failures at explanation and in seeking leads for future in- 
quiry. For these reasons the simple notion that one should eliminate unique 
identifiers as early as possible and not retain them so that information can 
be recaptured in terms of them seems often unwise. 

The case history and case study techniques of inquiry and reporting are 
perhaps more vulnerable to disclosure of information than other forms of 






analysis since they rely very much on retaining at least minimal unique iden- 
tification for the information — separation of identifiers from the informa- 
tion but a capability for matching them. Both individual and corporate actor 
identities are at stake in such inquiry and in much evaluative research. 
Since often considerable information is published on a case basis, efforts 
are made to protect confidentiality by alteration of identifying character- 
istics, etc., but unless the investigator has been able to keep confidential 
the identities of all participants, alteration of identifiers may be a weak 
form of protection, given the limited knowledge one may have about what in- 
formation others possess that might permit them to make a unique identifica- 
tion. Thus the more one disaggregates information or publishes particular 
case information, the greater the risk of disclosure. This is more likely 
to be true for corporate than for individual actors, as previously noted. 
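The separation just described — identifiers kept apart from the information but with a capability for matching them — can be sketched in miniature. This is a hypothetical illustration, not a procedure from this paper; the file names, fields, and sample record are invented.

```python
# Hedged illustration: identifiers and case data stored separately,
# joined only by a random linkage code. Destroying the identifier file
# severs the matching capability and leaves anonymous records.
import secrets

identifier_file = {}   # linkage code -> identifying information (kept apart)
data_file = {}         # linkage code -> substantive case information

def enroll(name, address, responses):
    """Store identity and data in separate files joined only by a code."""
    code = secrets.token_hex(8)
    identifier_file[code] = {"name": name, "address": address}
    data_file[code] = responses
    return code

def anonymize():
    """Destroy the identifier file; the data survive, but no record
    can any longer be traced to a person."""
    identifier_file.clear()

code = enroll("A. Respondent", "123 Elm St.", {"q1": "yes"})
anonymize()
```

So long as the linkage file exists, the capability for matching remains, and with it the vulnerability to compulsory disclosure that the text describes; the design choice is when, not whether, to destroy it.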

Types of Information to be Protected . There are four principal types 
of information that must be protected where a promise of confidentiality 
has been extended. Both participants and investigators must be protected 
against breach of confidentiality and both informant and investigator types 
of information must be protected if confidentiality is to be protected. 
The four types of information requiring protection are (Nejelski and Peyser, 1975: 
B-37-41): (1) identity of the research subject; (2) contents of communica- 
tions with a subject; (3) direct observations of subjects; and (4) work 
product. 

(1) Identity of the research subject . The identity of individual and 
corporate actors depends upon either unique identifiers or a combination of 
identifiers that results in a unique identification. We have already dis- 
cussed the status of unique identifiers such as fingerprints, voiceprints, 






and photographs in behavioral science research and others may arise in bio- 
medical inquiry. But an equally likely possibility for obtaining uniquely 
identified information is that a number of identifiers permit unique identi- 
fication. There is, for example, in a given instance only one person who 
is of a given race, sex, age, income, and first name at a given apartment 
at a given address. I note that even fewer of these may be all that is nec- 
essary for a given unique identification in a given case, e.g., when there 
is a one person household at a given address that is the only household at 
that address. Unique identification from a set of identifiers is problematic 
in the law of evidence, so that what is uniquely identifiable 
information for behavioral scientists would not be regarded as meeting a 
sufficiency test for conclusion beyond a reasonable doubt at law. Yet the 
social world is not built on proof beyond a reasonable doubt, and much harm 
is done from the use of identifiers to converge toward unique identification. 
A formal system of law exists to protect against the doing of harm based on 
these forms of unique identification. Where unique identification is a pos- 
sibility either because of unique identifiers or through the convergence of 
identifiers, and a promise or need for its protection exists, that informa- 
tion must be protected whether for an individual or a corporate actor. 

(2) Contents of communications with a participant . Identifiers do 
not exist simply in the form of characteristics of individual or corporate 
actors but in the contents of their communications. Not uncommonly, knowing 
what specifically was disclosed to the investigator can lead to a unique 
identification, since that person, and that person only, was privy to that 
information or is responsible for it. As Nejelski notes, one reason for 
protecting the contents of communications from a source is the practical 






difficulty "...in distinguishing between the information per se and informa- 
tion that would reasonably reveal the identity of the sources" (Nejelski 
and Peyser, 1975: B-39). We would think that the protection of the contents 
of communications might be more important for corporate than individual 
actors, but in the aggregate more individual than corporate actors might be 
so protected given their relative numbers in a population of participants. 

(3) Direct observation of participants . Access to individual and 
corporate actors to observe their behavior, including illegal behavior, is 
essential to some kinds of inquiry. Those observations must be protected 
even when it has not been possible to obtain prior consent. Indeed, under 
some circumstances, as previously noted, one does not obtain consent for 
the observations but nevertheless records them. Thus the interviewer may 
be directed to record the race and sex of a person without direct inquiry or 
to record whether or not the subject appeared to be truthful or cooperative, 
and so on. Such observations are essential to improving the validity and 
reliability of behavioral science studies and should be protected. 

(4) Investigator's work product . During the course of a research in- 
quiry, much work product is produced that potentially would permit the dis- 
closure of confidential information. Such product will never be disseminated 
and every effort will be made to insure that any final work product is free 
of the possibility of disclosure. Yet during the inquiry some work product 
may inevitably permit disclosure. We might, for example, have a computer 
output prepared that disaggregated information to a given level on the pre- 
sumption that there are sufficient numbers of cases to do so and provide con- 
fidentiality. Yet when the output is examined, it is clear that unique iden- 
tification is possible. One might not necessarily destroy that output im- 
mediately since there are ways to protect for that particular case (by simple 




recombination, for example). Quite obviously, since one cannot always guard 
in advance against the generation of work product that would permit dis- 
closure of confidential information, it requires protection. 
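The screening of work product described above can be sketched as a simple check of cell counts before any tabulation is treated as safe. This is a hypothetical illustration; the paper specifies no threshold, so the minimum cell size below is an assumption, and the sample table is invented.

```python
# Hedged illustration: screening a cross-tabulation for small cells
# that would turn the output itself into identifying work product.
MIN_CELL = 3  # assumed threshold; the text names no particular number

def unsafe_cells(table):
    """Return the cells whose counts are so small that the tabulation
    converges on identifiable individuals."""
    return {cell: n for cell, n in table.items() if 0 < n < MIN_CELL}

# Hypothetical cross-tabulation of responses by department and answer.
table = {("dept_a", "yes"): 14, ("dept_a", "no"): 1, ("dept_b", "yes"): 6}
flagged = unsafe_cells(table)
# A count of 1 in ("dept_a", "no") points to a single respondent; the
# output should be recombined or suppressed before any dissemination.
```

As the text notes, one need not destroy such output at once; recombining categories until every cell clears the threshold preserves the analysis while removing the identifying residue.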

Types of Matters Where Disclosure Is Harmful . Social harm is a conse- 
quence of the actions of persons or collectivities against those who are 
harmed. While harm may occur when it is not intended as well as when it is 
intended, the disclosure of information results in harm whether or not it 
is intended whenever its consequences are harmful. At times the same in- 
formation causes both harm and benefit. Unfortunately, one cannot always 
predict harmful or beneficial consequences and social prediction in many 
areas is at best subject to considerable error. What we must rely upon, 
therefore, is some notion of the potential for harm resulting from the dis- 
closure of any piece of information, while recognizing that the disclosure 
of any information on private matters may be both harmful and beneficial. 

We shall concern ourselves here only with those kinds of matters for 
which confidentiality must be promised because of their high potential for 
harm and low potential for benefit on disclosure. These are all matters 
on which behavioral science inquiry seems justified on grounds of public 
interest in the inquiry. They are only briefly enumerated since the argu- 
ment should be clear from their enumeration: 

1. Legal matters where the disclosure of information leads to a legal 
proceeding that is harmful to individual or corporate actors, including in- 
formation from legally privileged communications and on violations of law 
leading to criminal, civil or administrative proceedings and sanctions. 

2. Violations of organizational rules and regulations for which there 
are organizational sanctions against members. 






3. Conduct that is stigmatized within the larger society or any group 
therein who have power to stigmatize conduct. The consequences of stigmati- 
zation may range from social exclusion and isolation to subtle forms of re- 
jection and discrimination. Their potential for harm in any case may be 
considerable since stigma can affect the life-chances of individuals. 

4. Performance and achievement measures or any related item that or- 
dinarily is considered a private matter and its disclosure is regarded as 
violating the privacy of the actor. 

5. Official or organizational secrets whose disclosure harms the parti- 
cipant by altering advantages or reducing benefits; their disclosure can be 
the cause of considerable damage. 

These are but some of the major types of matters to which behavioral 
scientists can and do become privy in the course of inquiry and confidential- 
ity must ordinarily be extended and protected if research on them is to con- 
tinue. One must take note of the fact that considerable behavioral science 
inquiry has occurred on all of these matters without disclosure having be- 
come problematic. This is in large measure due to the integrity of all mem- 
bers and employees of the research community in respecting fully any confidence 
given and to the openness and tolerance of the larger society toward that 
research community in withholding strong support for compulsory disclosure 
of such matters. 

Exclusion of Protection for Information Acquired in Research . Both to 
constrain investigators from unduly intruding upon the privacy of persons 
whose confidence is obtained and to sustain the public interest in and right 
to information, some types of information, it is maintained, should be ex- 
cluded from a promise of confidentiality and from protection if acquired. 
We have already dealt with the question of what might be excluded from a 




promise of confidentiality and deal here with the question of whether there 
should be specific exclusion of any matters from legal forms of protection. 
The following matters have been suggested for exemption from legal pro- 
tection; we shall maintain that they should be covered. 

First, it is argued that information that lies outside the scope of the 
project should not be protected if the investigator elicits it and it is 
supplied by the participant. To leave that information unprotected, we sug- 
gest, is to punish participants for investigator misdoing. 

Second, it is argued that the information participants provide that lies 
outside the investigator's promise of confidentiality should be left unpro- 
tected. Elsewhere we have suggested that it is difficult for participants to 
maintain those distinctions or even to perceive them. Moreover, they are 
not altogether in control of the activity which takes place when the investi- 
gator is present, particularly when consent has been given to enter a pri- 
vate place. Finally, persons may be damaged by such means as "guilt by as- 
sociation" or mere presence in a situation; the disclosure of such matters 
is unwarranted. We would suggest that participants should be fully protected 
in providing information once consent is given though investigators should 
be obligated to constrain them from offering any information that may be 
damaging to either party and that is not essential to the inquiry. 

Third, any investigator acquires information even when subjects refuse 
or for some reason fail to participate or comply with what it is that the 
investigator seeks by inquiry, including a refusal to grant informed consent. 
We have already noted that even a refusal to grant informed consent may be 
damaging if known, such as the disclosure of the refusal of a member of a stigma- 
tized group. Within total institutions or any institution that provides for 






the compliance of their members with the inquiry, knowledge of failure to 
comply may be damaging and should be protected. Even within a total insti- 
tutional setting when informed consent is obtained, such as in a prison, if 
the prisoner is released to appear for interview, for example, and fails to 
appear as agreed upon, should such information be protected on inquiry from 
the warden? These are no simple matters, but there should be no categorical 
exclusion of them from protection since in many, if not most, instances such 
protection should be afforded if behavioral science inquiry is to be sustained. 

Indeed, one might make a reasonably compelling case that all information 
except that relating to potentially great harm at some future time should be 
protected from disclosure whether or not it was made a matter of confidence 
in securing informed consent. The reasons for this argument are several. 

1. It is difficult to prove what both subjects and investi- 
gators intended and understood, and which specific matters are and are not 
to be protected by agreement. More harm may be done from just such misunder- 
standing than would be done by full protection. 

2. Where information is acquired because investigators exceeded their 
authority, participants should not be punished for their failures. 

3. Since there is considerable variability among the participants in 
many studies in their levels of education and other skills, it is unreason- 
able to expect that they will specifically monitor what can and cannot be 
said or done with a given agreement. 

4. Finally, absence of protection would likely generate patterned eva- 
sion on the part of investigators; much more attention would be given to ex- 
cluding from the record information that was acquired lest it be considered 
extraneous and jeopardize the collection and retention of other information. 






While such evasion might well be desirable in that it protects such informa- 
tion from disclosure, as a matter of record, it is still unprotected testi- 
monial evidence. It may well affect the quality of information also, parti- 
cularly since such screening must often be delegated to employees whose train- 
ing in what to include and exclude is far from ideal. 

There are conditions under which investigators should have full protec- 
tion to keep confidential both information that was acquired through explicit 
agreement in eliciting informed consent and information that was not, even though 
the investigator is not in a position to protect against its disclosure. 
That is, investigators warrant only that they keep confidential the information 
as they acquire it, guaranteeing nothing other than that they will 
not be the source of disclosure. At the same time, some obligation falls 
upon the investigator to remind participants that others than the investiga- 
tor may be the source of disclosure. 

There is likely to be misunderstanding about what it is that investiga- 
tors agree to and can protect. Investigators can protect only the informa- 
tion they acquire from their being an agent of disclosure. They can offer 
reasonably adequate guarantees of that protection only when they have legal 
protection against its compulsory disclosure and strong sanctions to protect 
it from illegal or unauthorized misuse. Yet participants and others are 
all too quick to conclude that what is being guaranteed is protection from 
its becoming public knowledge, forgetting that the only protection afforded 
is that the investigative agent agrees not to be an agent for its disclosure. 
This is no simple matter in securing and protecting informed consent since 
participants are all too easily confused in such matters and some protection 
must be afforded in the face of their possible confusion. 




Let us return again to our elementary Human Subject model of a single 
investigator and single subject. There the acquisition of confidential in- 
formation can be such that if it never has been or will be communicated to 
anyone else, or if in fact only the investigator acquires it through his in- 
tervention on the subject, protection against disclosure is a likely event. 
Yet if both subject and investigator share the information, either may be 
the agent of disclosure. The subject may even do so by sharing it in other 
modes of confidence including other agents who have and can afford protec- 
tion. Yet in many situations where informed consent is elicited and confi- 
dentiality promised, it can apply only to the agent and disclosure from 
others is problematic. This is particularly true in some kinds of behavioral 
sciences research where the information derives from group settings or corporate 
actions. Whenever there is a third party to confidence; as is often the 
case in behavioral science inquiry, disclosure is possible without involving 
the agent investigator as the source of that harm. Paradoxically, he may be 
perceived to be the source of that disclosure when in fact he has been a pro- 
tector and has no way of demonstrating that he has not been the agent of dis- 
closure. The most that can be done in such instances is to demonstrate that 
there are no compelling or other reasons why he should have been its agent. 
Given the fact that disclosure is possible from more than one source in many 
kinds of behavioral science inquiry, it is incumbent upon investigators to 
advise persons from whom potentially harmful information is sought that the 
protection they offer — even if legal or other protections are afforded — provides 
no guarantee against its disclosure by others. Clearly when investigators 
acquire confidential information in group settings or when third parties are 
present, they have an obligation to advise them that anyone present other than the 






investigator is a potential source of disclosure (unless they are covered 
by the investigator's privilege, e.g., as employees). 

Types of Methods Presenting Special Problems for Protecting Confiden- 
tiality . We have observed many times that methods vary considerably in 
their capacity to provide anonymity in the eliciting process and analysis 
phases of inquiry. Here we shall focus on some special problems that be- 
havioral science methods present in eliciting and protecting confidential 
information. 

1. Techniques for the self-reporting of behavior . These techniques 
include a wide range of tests (e.g., achievement or performance measures), 
questionnaires and scales (e.g., items in a masculinity-femininity scale or 
personality test), and interviews, among others. The critical matter here 
is whether and how the technique is linked to unique identification. Where 
unique identification is possible, protection is especially critical, since 
self-reports of behavior have considerable evidentiary value, more so than 
might ordinarily be the case when they derive from an impartial inquiry such 
as research. Where self-reports of illegal or other damaging forms of be- 
havior are elicited, the government is obligated to provide legal protection 
if it sponsors the inquiry. Parenthetically, we might note here that the 
government assumes certain obligations if it sponsors inquiry for eliciting 
confidential information, not the least of which is an obligation to provide 
legal protection for that which is confidential and potentially harmful. 

2. Direct observation and recording of behavior . Again the evidentiary 
value of such information is considerable, particularly when it has unique 
identifiers such as in audio-visual recording. Both information obtained 
from direct observation and from direct recording (e.g., tape-recording or 






video-tapes) require special protection and special obligations to insure 
their protection. Participant observation is an especially vulnerable tech- 
nique since the participant observer acquires information by virtue of posi- 
tion that might otherwise be disclosed only as a confidence but at the same time 
has a special status in providing testimonial evidence — as observer and as 
scientific observer. Direct observation poses especially difficult problems 
where entire groups rather than individuals are under observation since they 
are especially vulnerable, as previously noted, to the disclosure of informa- 
tion from a large number of potential sources. 

3. Investigator intervention in social situations . When investigators 
intervene in situations and that intervention itself gives rise to confiden- 
tial information that is shared by all persons in the group (as in guided- 
group interaction techniques of intervention), the investigator has a special 
burden: whatever legal protection is afforded an investigator may be inade- 
quate to forestall disclosure of information, particularly since group pro- 
cesses of sharing information — rumor and gossip, for example — have their own dynamic 
elements. The problem poses a special moral or ethical dilemma since in 
these circumstances the confidential information is created by the research 
intervention and others become party to it because of the nature of that in- 
tervention. Indeed, but for the intervention, other parties might not be 
privy to the information. By way of illustration, imagine an experiment 
using guided-group interaction techniques with a group made up of alcoholics; 
under both interventions from the investigator or his agent or other members 
of the group, confidential information is disclosed and necessarily shared 
by all. We shall not review in depth here the special problems that such 
group techniques pose but simply note that they have enormous power to induce 






confidential information that persons would not otherwise disclose; at the 
same time such processes have a potential for doing harm to participants 
that cannot be predicted. Interventions that are particularly designed to 
elicit information as well as to produce a separate result — such as group 
therapeutic or interview techniques — require special examination because 
they present both a potential for harm and a potential for harmful disclosure. 
4. Informant and relational techniques . A surprising number of be- 
havioral science techniques are based on a model not only of self-reporting 
but of informant reporting. The subjects are asked to report on another 
directly, e.g., what did your mother do then? or to describe relationships 
that necessarily supply information on others indirectly, e.g., did your 
father and mother have a quarrel over that? We call special attention to 
the fact that any technique which elicits information on social relationships 
not only poses problems of eliciting informed consent as noted earlier, but 
special problems of protecting confidences that were not secured by informed 
consent. They pose special problems not only because consent was not ob- 
tained but they must qualify the extent to which anyone who supplies the 
information has a right to request its disclosure. The very nature of in- 
formation about relationships, when they become implicated in a research in- 
quiry that develops them as items of information, is that it involves rela- 
tionships between the parties to it and the investigator who structured it 
as "relational information". Thus questions like "Do you hate your mother?", 
"Was your father working at this time?", "How much education does your mother 
have?" and many more intimate questions than these provide information on 
persons related to the participants being studied or on their relationships. 
Whenever such information is elicited, investigators have a special obligation 






to protect that information from disclosure that could harm the other parties 
as well as the participant who gave informed consent. 

We note in passing that behavioral science research can pose complex 
problems of protection of informant information. Suppose, for example, one 
wished to study an informing process by investigating police use of inform- 
ants in law enforcement, securing the confidence of both the police and their 
informants. Without legal protection for confidentiality the study would be 
impossible, yet it must rank as a rather high priority in understanding an 
important problem in the study of police practices and their effect on in- 
stitutions of privacy. 

5. Sociometric techniques . Sociometric techniques, as noted earlier, 
pose special problems in securing informed consent; they also pose problems 
of special protection, since disclosure of information on any person in the 
network is potentially harmful to all others. Thus studies of delinquent 
gangs, gay bars, a military squadron, and similar phenomena pose problems 
of special protection of confidence. 

6. A case history technique . Any study employing a case history 
technique that requires the retention of information that is uniquely iden- 
tifiable over a long period of data collection and analysis must be specially 
protected, since it is more vulnerable to both unauthorized and compulsory 
disclosure. The use of techniques that preclude the complete and early 
destruction of identifying information requires both that special precautions 
be taken to protect the processing and storage of information and that special 
forms of legal protection be available if disclosure of the information is 
potentially harmful. 

The Need for Formal Punitive Sanctions as Protection . The more biomedical 






and behavioral science inquiry is organized to include investigators and 
employees, each of whom undertakes one or more specialized tasks, the more 
administrative control must be exercised and the less professional ethics, 
commitment, and self-regulation can be counted upon to protect confidentiality. 
Not only must greater precaution be taken to protect the confidentiality of 
information from those outside the research organization who might seek access 
to it, but also against its unauthorized or illegal use by employees. A 
typical survey research study, for example, might involve more than a hundred 
different employees who could have access to confidential information. Others 
who are not employees may also have access to it, such as student trainees 
and assistants who volunteer their services in exchange for training. 

The more potentially harmful the disclosure of confidential information, 
the greater the obligation for its protection. Where serious harm could re- 
sult from its disclosure, investigators or sponsors must have access to for- 
mal sanctions for any unauthorized disclosure or misuse. The proposed LEAA 
regulations for the protection of confidentiality of identifiable research 
and statistical information make provision for sanctions. Section 22.29 pro- 
vides (28 CFR): 

"Where LEAA believes that a violation has occurred of Section 
524a, these regulations, or any grant or contract conditions 
entered into thereunder, it may initiate administrative actions 
leading to the termination of a grant or contract, commence ap- 
propriate personnel and/or other procedures in cases involving 
Federal employees, and/or undertake appropriate legal actions 
leading to imposition of a fine not to exceed $10,000 against 
any person responsible for violation." 

Hence any employee of any investigator is subject to a fine of $10,000 
if he/she in any way knowingly violates the protections provided for confiden- 
tiality of identifiable research and statistical information. Should 






investigators seek to safeguard confidentiality of identifiable information 
where persons who are not ordinarily employees might be given access to it, 
such protection is easily afforded by nominal appointment (a $1-a-year appoint- 
ment, for example) as an employee. 






IV. MINIMIZING RISK FROM DISCLOSURE OF CONFIDENTIAL MATTERS 

The main risk of harm in much behavioral science inquiry stems from the 
disclosure of private matters to which socially harmful responses are then 
made. We have pointed out that even the mere refusal to grant informed con- 
sent and its documentation can pose risks to participants when that simple 
fact is disclosed. For many kinds of inquiry, moreover, uniquely identifi- 
able information results from the procedures required to accession subjects — 
they ordinarily are not volunteers — and from the procedures for acquiring 
and processing information. Such information should be protected insofar as 
possible not only if there is risk of harm but if the participants desire 
its protection for any reason whatsoever . Finally, we noted that investigators 
often become privy to private matters that are not intended by the mode of 
accessioning participants or by eliciting procedures; acquiring such information 
often is an unintended consequence of the necessity for gathering data in 
social situations that have a dynamic life of their own. The disclosure of 
these unintended matters may also harm individual and corporate actors, and 
the risk of their disclosure must be minimized if behavioral science inquiry 
is to continue as a vital form of scientific inquiry in the public interest — 
an interest that is minimally presumed whenever government sponsors research. 
We shall examine briefly some modes for minimizing risk from disclosure, 
focusing particularly, however, on forms of legal protection that government 
sponsors may provide for inquiry involving individual and corporate actors. 

Protection by Anonymity in Accessioning Participants and Eliciting In- 
formation . There are many different techniques for accessioning participants 
and eliciting information that minimize risk because they ipso facto insure 
anonymity. These have been discussed briefly in general terms. No specific 






catalogue of them is presented here. We previously indicated that two rules 
might well apply when there is risk of harm from disclosure of information — 
provided that disclosure is not a matter of formal contract, as it may well 
be in much evaluation research. These rules may be stated: 

1. Information that may cause harm if disclosed should not be collected 
unless it is necessary to the particular inquiry; 

2. When the objectives of a particular inquiry will not be undermined 
by either accessioning participants anonymously or by anonymous 
procedures for eliciting information (or both), they should be 
required over any other procedure of accessioning or elicitation. 
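One classic instance of the kind of anonymous eliciting procedure that rule 2 contemplates is Warner's randomized-response technique. The sketch below is purely illustrative and is not drawn from the report; the function names and the choice of design parameter p are our own assumptions.

```python
import random

def randomized_response(truthful_answer, p, rng=random.random):
    """Warner's design: with probability p the respondent answers the
    sensitive question truthfully; otherwise he answers its complement.
    No single recorded answer therefore reveals the respondent's status."""
    if rng() < p:
        return truthful_answer       # answer the sensitive question
    return not truthful_answer       # answer its complement

def estimate_prevalence(responses, p):
    """Recover the population rate pi from the observed yes-rate lam:
    lam = p*pi + (1 - p)*(1 - pi), so pi = (lam - (1 - p)) / (2*p - 1),
    valid for p != 0.5."""
    lam = sum(responses) / len(responses)
    return (lam - (1 - p)) / (2 * p - 1)
```

Because only the aggregate yes-rate is informative, an investigator can publish prevalence estimates while each individual recorded answer remains uninformative about its source.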

Yet, as we have noted, there are distinct limits to the use of such 
anonymous procedures. Among those we have mentioned are these: (1) identi- 
fying information is necessary to increase the validity and reliability of 
information and to estimate error in information; (2) identifying informa- 
tion is necessary for many designs that measure changes in the behavior of 
individual or corporate actors; (3) identifying information, and even its 
disclosure, may be necessary in evaluation research; (4) identifying in- 
formation is necessary to some eliciting and data gathering procedures that 
are essential to a particular form of inquiry; and (5) identifying informa- 
tion is necessary when information from independently derived sources must 
be collated, e.g., information derived from interviews and records of past 
behavior are brought together for the same individual or corporate actor. 
The list is not exhaustive. Several rules may be stated with respect to 
safeguarding uniquely identified information from disclosure: 

1. Unique identifiers should not be collected unless they can be demon- 
strated to be essential to the particular inquiry. 




2. Similarly, identifiers that, when taken collectively, provide unique 
identification should not be collected unless they are essential 
to a particular inquiry. 

3. Any identifiers should be separated from any information sources 
as soon as they no longer are essential to an inquiry; identifiers 
should then be destroyed unless it is demonstrated that they are 
necessary to some later stage of inquiry. 

4. All information on identifiers that may be linked to information 
and all information that has a potential for unique identification 
should be physically protected from access by anyone other than 
authorized personnel. Institutional sponsors and principal investi- 
gators should be legally obligated to provide such physical pro- 
tection when there is risk of harm from disclosure (28 CFR 22.23(5); 
42 CFR 2a.4(5)). 

5. Access to uniquely identifiable data "shall be limited to those em- 
ployees having a need therefore, and that such persons shall be ad- 
vised of, and agree to comply with these regulations" (28 CFR 22.23(2)). 

6. Provision shall be made for the final disposition of any identifiable 
materials, either by their complete destruction upon completion of a 
research inquiry, or by separation and destruction of any identifiers, 
or by provision for maintaining their security to make possible 
longitudinal or continuing studies (28 CFR 22.25). Special attention 
is called to the fact that unique identifiers pose special problems 
for retention, particularly when each bit of information has unique 
identification, as it does in tape or video-tape recordings. More 






stringent criteria for protection must apply to the retention of 
unique identifiers. Attention is called also to the fact that government 
sponsors have responsibilities under the Freedom of Information Act to notify 
and make accessible records that are uniquely identifiable, including re- 
search records they may acquire from sponsored research. Any transfer of 
records with uniquely identifiable information to a government sponsor thus 
poses enormous administrative burdens of notification, problems of correct- 
ing a record, etc. 
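Rules 3 and 4 above amount to a separation discipline: substantive data carry only an arbitrary code, the code-to-identifier link is guarded separately, and the link is destroyed once it is no longer essential. The following fragment is a modern illustrative sketch of that discipline, not a procedure prescribed by the report; the two-table scheme and the field names are our own assumptions.

```python
import secrets

def split_identifiers(records):
    """Separate unique identifiers (here, 'name') from substantive data.
    Returns (link_table, deidentified): the link table maps a random,
    non-derivable study code back to the identifier and is to be stored
    and physically protected separately from the de-identified records."""
    link_table = {}
    deidentified = []
    for rec in records:
        code = secrets.token_hex(4)      # arbitrary study code
        link_table[code] = rec["name"]   # kept under separate protection
        data = {k: v for k, v in rec.items() if k != "name"}
        data["code"] = code
        deidentified.append(data)
    return link_table, deidentified

def destroy_link_table(link_table):
    """Rule 3: destroy identifiers once no longer essential to the inquiry."""
    link_table.clear()
```

Once the link table is destroyed, the remaining records can no longer be traced to individuals, which is the state rules 3 and 6 require at the close of an inquiry that has no longitudinal need for identifiers.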

Legal Protection from Compulsory Disclosure . 

The growth of behavioral science inquiry has brought with it the recog- 
nition that the information acquired has uses other than scientific in- 
quiry. Information gathered by behavioral scientists, like that gathered by 
journalists, often is useful to others as well. Law enforcement agents and leg- 
islative, executive, and judicial bodies often find useful information that 
is uniquely identifiable and may seek to compel its disclosure to accomplish 
their own ends. It goes almost without saying that the ends of such bodies 
at times not only conflict with those of behavioral science inquiry but, taken 
collectively, threaten the very foundations of that inquiry by the ways 
in which information is used. 

The need to protect information gathered in behavioral science inquiry 
from use by others is considerable. As Nejelski and Peyser (1975:B-1) note, 
research participants have "a paramount interest in keeping the invasion of 
their privacy to a minimum and making sure that the information will not be 
the basis for prosecution or reprisal." Moreover, investigators have an 
interest in maintaining that privacy to insure the continuing participation 
of participants and to insure the quality of the information they acquire. 




Sponsors of research have a similar interest. When the State is the sponsor, 
there may be conflicting interests — to protect the integrity of the scienti- 
fic inquiry by protecting confidentiality but also to compel its disclosure 
for its other ends. Yet in the broad rather than in the narrow public in- 
terest, the State as Society would seem to have an overriding interest in 
protecting behavioral science inquiry from compulsory disclosure, both in its 
general role of protecting free scientific inquiry and in its more special 
one as sponsor of specific research investigations. 

There are two major forms of legal protection proposed to protect be- 
havioral science inquiry from compulsory disclosure. The first form, that 
of a statutory privilege , protects from compulsory processes all information 
gathered in the course of an individual's research. These statutes are com- 
monly referred to as "shield laws". They are designed to meet the needs of 
the State, in terms of its general interest in protecting all behavioral 
science inquiry from compulsory processes of disclosure. The second form, 
that of a confidentiality certificate , protects from compulsory processes all 
individually identifiable information that is gathered in a particular re- 
search study sponsored and funded by the federal agency issuing the certifi- 
cate. This form of protection meets the needs of a particular research 
sponsor and leaves unprotected any investigation where the government is not 
directly implicated as sponsor. It is obvious that a statutory privilege, 
since it offers general protection, has more far-reaching implications for 
the development of behavioral science than does the confidentiality certifi- 
cate. Each is now considered in somewhat greater detail. 

Statutory Protection . 

There are at the present time few federal and state statutes that are 






specifically designed to protect research investigators or research informa- 
tion and activity. A recent review of these statutes (Nejelski and Peyser, 
1975:B-20-21) concludes that these statutes provide protection for only a 
small minority of all behavioral science investigators and investigations. 
They conclude, moreover, that there are major drawbacks to the limited and 
specific protection offered by current statutes, including those that provide 
a limited privilege for a given kind of research , such as drug research. 
Apart from the fact that they afford protection for only a small segment of 
the community in need of protection of confidentiality, some depend upon the 
discretion of officials for extension of the protection. As Nejelski and 
Peyser observe (1975:B-21): "Such discretion, as well as the requirement 
that researchers be 'licensed' before they receive protection, could severely 
threaten the freedom of researchers to pursue controversial avenues of in- 
quiry". The point is that the general interest of society in free scientific 
inquiry is much less well protected by statutes granting privilege for a 
specific inquiry than by one that extends it to all qualified investigators 
and their research activity. 

We do not propose to discuss here in any detail proposals for a behav- 
ioral science investigator's shield law. An example is provided in the model 
statute proposed by Nejelski and Peyser (1975:B-9-11). Rather, we shall 
examine some of the issues that are raised by statutory protection and the 
resolution of these that Nejelski and Peyser provide in a model shield law for 
behavioral science investigators, together with some of the reasons pertaining 
thereto. With them, we define the major issues to be those of who is to be 
covered by the statutory privilege, to what matters shall the privilege ex- 
tend, what is the scope of the protection, including possible limitations, 




who may invoke the privilege, and the right of waiver. 

Who Is to Be Covered ? Nejelski and Peyser (1975:B-31-32) observe that 
there are four principal ways of defining who is to be covered. The first 
is simply to name a category, such as behavioral scientists, leaving undefined 
who is a behavioral scientist. Such an approach involves serious ambiguity 
that must be settled by litigation. A second approach is to extend the 
privilege to certain kinds of information, such as particular kinds of re- 
cords. This is the least ambiguous of all definitions of coverage but un- 
fortunately leaves many kinds of legitimate inquiry without protection. One 
might add that statutes that grant discretion to officials to decide what 
is to be covered by limiting it to a specific inquiry have a similar limita- 
tion. The third major way is to specify a relationship between the person 
protected and a specific type of research activity, such as might be the 
case in granting protection to all persons who are engaged in research on 
the use and effect of drugs (be it noted, as is the case now with federal 
legislation limiting that privilege to investigations under federal sponsor- 
ship). The advantage of this approach will depend upon the extent to which 
it can effectively cover a sufficiently large number of categories of re- 
search so as not to unduly restrict inquiry. The fourth approach is what 
Nejelski and Peyser identify as the functional approach and the one they 
use for their model statute. The functional approach confers protection on 
all individuals who perform a particular role in a specified way . This ap- 
proach, they note, has the advantage of covering all individuals in all fields 
of inquiry, including the biological and natural as well as the behavioral 
sciences, if their activities conform to a specified pattern of behavior. 
Moreover, a functional approach extends the protection to all individuals involved in 






the research process, not simply to those who actually elicit information 
from individuals. I note parenthetically that a limited functional approach 
is followed in some of the proposed federal regulations for confidentiality 
certificates, the alternative approach discussed later. 

In considering the matter of what role is to be covered in what speci- 
fied way, a number of issues arise. The problem of who is a behavioral 
science investigator is an especially difficult one, since any mode of 
resolution by statute has limitations. Licensing poses problems of creating 
licensing authorities who may serve as gatekeepers. To require particular 
affiliations has similar limitations. To resolve the matter, Nejelski and 
Peyser propose to sacrifice specificity and precision "...to accommodate 
all those who have a bona fide involvement with research activity". They 
extend coverage to all individuals who in some way deal with information 
"obtained employing principles recognized or standards accepted in the field 
of inquiry" (1975:B-33). 

A statutory protection must conform, of course, to the requirement that 
the research activity have a public benefit, to square the statute, as 
Nejelski and Peyser note (1975:B-34), with the First Amendment. 

To What Matters Shall the Privilege Extend ? A general statutory privil- 
ege protecting research investigators from compelled disclosure of informa- 
tion would exempt investigators from their civic obligation to provide evi- 
dence in civil and criminal proceedings only when the information sought was 
obtained from research activity. The statute should restrict coverage "to 
information handled 'in the course of' research activity" (Nejelski and Peyser, 
1975:B-35) to make certain that investigators are not covered for material 
obtained in their other roles. There will, of course, be grey areas in the 




use of some behavioral science techniques with this provision, such as in 
participant observation, where the research investigator has difficulty de- 
termining when research activity begins and ends as, for example, when a 
participant observer lives in a community to investigate compliance and 
conformity in it. 

Any general statute should extend protection from compelled disclosure 
to all information that is obtained in the course of the research inquiry 
whether or not it is specifically covered by the research design and whether 
or not it is a matter of implicit or explicit promise in securing informed 
consent by promising confidentiality. This is a critical provision in all 
forms of protection from compelled disclosure. An example may show how 
broadly this provision might apply and why it is essential. Suppose one is 
doing a sample survey of people's attitudes toward child abuse and has con- 
sent to enter a private place to conduct the interview and the informed 
consent of the participant to interview about these attitudes. Now suppose 
that during the course of the interview the respondent punishes the child 
in such a way that it might well constitute child abuse. Let us further 
suppose that in a judicial proceeding information is sought from the inter- 
viewer on that incident of "child abuse". Should it be exempt from compelled 
disclosure? Our answer is that it should be, for a number of reasons shared 
also by Nejelski and Peyser (1975:B-35-36). 

First, any participant's interest in keeping the invasion of their 
privacy to a minimum and in insuring that any information they provide, either 
orally or otherwise, will not open them to prosecution or other possible 
harm from disclosure can be fostered only when the protection of the statute 
does not rest in a promise of confidentiality. 




Second, while serious ethical issues can be raised in particular in- 
stances about granting such protection, the protection of any person should 
not depend upon the fortuitous circumstance of whether or not the investi- 
gator explicitly promised confidentiality for given information. As we have 
had occasion to note previously, any extension of confidentiality often is 
regarded by participants as a trust relationship — they come to have confi- 
dence in the interviewer as the relationship proceeds, and even any explicit 
statements of exemption made in securing informed consent may come to be 
"forgotten" as trust develops. Such risks are generally less where the 
relationship is of short duration, as with some research techniques, but 
in others, e.g., prolonged observation or treatment, the trust relation- 
ship may become considerable. It is difficult for research investigators 
and participants to avoid these elements of trust, and participants should 
not be placed in jeopardy because they either disclose or behave in ways 
that provide information that is potentially harmful to them. 

Third, there are a number of modes of inquiry that often preclude ex- 
plicit or implicit promises of confidentiality because of what they are 
measuring, as we have noted before in tests of personality or of qualities 
such as prejudice and discrimination, as will occur in direct observation 
of human behavior, and as in the study of social relationships. One cannot, 
as noted before, study social relationships without becoming privy to informa- 
tion about others whose consent was not obtained and who deserve the protec- 
tion of confidentiality. A wife who talks about her relationship with her 
husband is not the only party subject to protection; her husband is as well 
if that information is also potentially harmful to him. 

Finally, it should be noted that if any statute were to be open to 






considerable litigation, as might well be the case when what is at stake 
is what is and what is not covered by the privilege, it would lose its 
protective force and the benefits to scientific research that it is designed 
to provide. We would maintain that an exclusion from protection of all in- 
formation that was not a specific matter of consent would open the protection 
to just such damage. 

We shall simply note here that matters we have previously discussed 
must also be covered by any viable shield law: (1) the identity of the sub- 
ject, whether by unique identifiers or other means of unique identification, 
including the specific knowledge that subjects were approached and refused in- 
formed consent, since that may be damaging; (2) the contents of all communi- 
cations with any participant, any information acquired through direct or 
indirect modes of observation, and the work product of investigators. The 
reasons for being sure these are protected have already been provided. 

What Shall Be the Scope of the Protection and Any Limitations ? There 
are a number of reasons, as Nejelski and Peyser note (1975:B-41-42), why it 
is not quite appropriate to regard a research investigator privilege as 
either absolute or qualified. Those matters aside, ideally one wants to 
provide the maximum possible protection, given the ever present problem that 
the "law is what the courts decide". The question of maximum possible pro- 
tection perhaps is best approached by answering the question of the circum- 
stances under which the privilege will be divested while seeking the maximum 
possible coverage. 

There are many types of proceedings to which the privilege might apply, 
including legislative, executive and judicial proceedings. They include 
investigative and adjudicatory proceedings. While it can be maintained that 






investigative proceedings are potentially more damaging than adjudicatory 
proceedings and that the privilege should extend only to the former, particularly 
if the identity of the participant or source of information is specifically 
excluded in adjudicatory proceedings — in short, that a qualified privilege 
extend to the contents of communication — there is considerable risk in 
trying to maintain that distinction, and a simple example may show why. Were 
one to report that all of the participants in a given inquiry were, say, 
drug users, and were it known from some independent source that a given person 
was a participant in the study, identity remains unprotected. Even more 
qualified statements about subgroups can similarly lead to disclosure. I 
note, parenthetically, that investigators have an obligation to protect iden- 
tity in the manner they report research results and that if statements are 
made of the sort above, they risk exposure of identity. 

Quite clearly, all compulsory proceedings, whether legislative, execu- 
tive, or judicial, should be covered if maximum protection is desired . The 
language of the proposed protection of identity in human subject research 
of DHEW might well apply to a general statutory privilege: "Persons...author- 
ized may not, at any time, be compelled in a Federal, State or local civil, 
criminal, administrative, legislative, or other proceeding to identify the 
research subjects encompassed by the Certificate, except in those circum- 
stances specified..." (matters of waiver) (42 CFR 2a.7(a)). Note that the 
scope extends here to all levels of jurisdiction, a matter that clearly re- 
quires separate legislative authority. 

The matter of whether there should be further qualification dependent 
upon other overriding interests is also at stake in a statutory privilege, 
whether general or specific. Among the major overriding interests often 
considered are those of national security, law enforcement, prior crimes, 
and future crimes. We shall not review the arguments for and against their 
inclusion or exclusion. The reader is referred to Nejelski and Peyser for 
arguments against qualification for information relating to national security, 
law enforcement, and prior crimes, arguments that appear to this research 
investigator as compelling. There is agreement, however, that information 
on future crimes should be subject to compulsory disclosure, particularly 
for the more serious or heinous crimes. 

There are, finally, some issues of a statutory privilege for research 
investigators conflicting with constitutional interests and other federal 
or state statutes. These matters would require an extended discussion, 
some of which is given by Nejelski and Peyser (1975:B-48-55). We would make 
note here only of the real possibility that the rights of a criminal defendant 
should not be violated by any statutory privilege. There is some risk that if the 
contents of communications as well as the specific identity of sources are ex- 
cluded by statute, it violates a defendant's Sixth Amendment interest — the 
right of the accused in criminal prosecutions to have compulsory processes 
for obtaining witnesses in his favor. The researcher privilege previously 
mentioned, covering the content of communications in criminal prosecutions, is 
therefore potentially in conflict with the Sixth Amendment rights of persons. 
Because the research investigator's privilege status rests in the First 
Amendment interest in providing the public with information, the statute 
provides the possibility of conflict between two constitutionally protected 
interests. An exception for defense subpoenas, "...opera- 
tive only if the defendant is acting in good faith in requesting the contents 
of communications or observations of the researcher" (Nejelski and Peyser, 
1975:B-49), may therefore be necessary in balancing First and Sixth Amendment 




rights. There should, however, be no exception to the provision that the 
identity of all research participants be protected; if, therefore, the 
effect of disclosure of contents is to disclose the identity of participants, 
the protection of identity of subjects should be overriding. 

We make simple note in passing that both the Federal Reports Act and 
the Freedom of Information Act are federal acts that would need to be ac- 
commodated in any proposed federal statute of the kind described here. 

Who May Invoke the Privilege ? A central issue in invoking the privilege 
is who assumes the burden of proof for qualification. It can be placed either 
upon the person asserting the privilege or upon the party requesting the in- 
formation. Placing the burden of proof on the person asserting the privilege 
would require some form of proof that the information sought is research 
data as defined by the statute, while placing it upon the party requesting the 
information requires proof that what is sought are not research data. Any 
failure by the requesting party to sustain this burden means the privilege 
is automatically effective. There are a number of reasons why the burden 
should perhaps fall upon the party asserting the privilege, the most compelling 
being that if the privilege confers the broad coverage deemed necessary for 
effective protection, it should be relatively easy for investigators to in- 
voke the privilege, and the burden should therefore fall upon the investigator. 

Nejelski and Peyser (1975:B-56) also note that the research investiga- 
tor's privilege can be further strengthened if the situations in which a sub- 
poena can be issued are carefully circumscribed by statute. 

Who May Waive the Privilege, When and How ? The proposed statute is designed 
to provide maximum possible protection against compulsory processes of 
disclosure of the identity of participants in research and any information 
connected with research activity. The question arises, however, whether 
there should be any power to divest the privilege by voluntarily disclosing 
privileged information. On the face of it, there is a compelling argument 
that the person who provided the information should have the right to divest 
the privilege. Yet the matter of divestiture is more complex, particularly 
when it is kept in mind that the power to waive any privilege provides 
substantial control over its exercise. Whenever information is provided, 
however, both participants and investigators acquire some right in the in- 
formation and its disclosure. While the research participant clearly has 
the greater stake in the information, that of the investigator is not insub- 
stantial. The investigator has obligations to protect information that per- 
tains, at the same time, to others as well as to the participant, and to pro- 
tect the integrity of the specific inquiry, which might well be damaged were 
disclosure to take place, e.g., while the investigation is still in progress. 

One way of balancing these rights is to require that both the partici-
pant and the investigator must voluntarily divest themselves of the privilege,
a resolution opted for by Nejelski and Peyser (1975:B-60). There could be
some qualification on the investigator's right, however, by providing that
he has the right to withhold consent only on a proper showing of the potential
damage to the investigator or others if the information is disclosed.

Lest considerable damage be done to the statutory privilege against compul-
sory disclosure by an absolute right of the research participant to voluntary
waiver, provision should be made to limit waiver to only certain situations.
The one obvious condition on when waiver should apply is that the party or
parties who are empowered to waive the privilege can do so only in






response to a subpoena or other legal process. Any other disclosure of the
information, whether by the parties to the research activity or by others,
should not dissolve the privilege. Unless specifically exempted, then,
government agencies, for example, would not have access to specifically
identifiable information or the identity of participants, including access
for audit or as a research sponsor. The role of the government in these
latter respects is not unimportant, and, as we shall later note, those
powers are reserved in granting a confidentiality certificate.

Earlier we noted that the presence of third parties makes it difficult
to protect confidentiality, since they are always potentially a source of
disclosure of confidential information. Yet inevitably in some behavioral
science inquiry there are third parties present. Their presence, however,
should not divest the privilege of confidentiality, as Nejelski and Peyser
conclude (1975:B-60-61): "Logically, the presence of a disinterested third
party would destroy confidentiality at the outset." But, "The researcher's
privilege as provided in the proposed statute... is not based on
confidentiality. In addition, the professional privileges protect only in-
formation revealed in the course of a direct conversation between the pro-
fessional and client. The researcher's privilege protects information ob-
tained by the researcher employing techniques that involve methods other
than direct communication... If the privilege were automatically waived when
a third disinterested party was present, the protection given in the men-
tioned situations would be meaningless."

These, then, are the main elements of a general statutory privilege
protecting research investigators and their participants, and the reasons
for them. There are good reasons to maintain that such protection should be
afforded






all inquiry where human life is involved, whether individuals are studied
singly or collectively. Yet there are both practical and other reasons why
this course might not be taken. Practically, such protection requires
considerable legislative activity by the Federal government and the States.
The statutes enacted by the States will be far from uniform, and such
protection might be long in coming. While it might be commended as a long-
run strategy for protection, since it provides protection for all legitimate
scientific inquiry on human beings and their social life, in the meantime
there also is a need for protection. The role of government in fostering
the right of the public to information, moreover, is clear and unmistakable
when it is the research sponsor. We turn, therefore, to the ways that the
federal government may protect confidentiality in its role as research
sponsor, dealing specifically with protection through the discretionary
granting of confidentiality certificates. Before doing so, we simply note
that the federal government can protect confidentiality in other ways as
well. It may, for example, provide protection for a given kind of research
categorically specified at law. This is done, for example, at the present
time, for some research on drug use. Some protection also is provided if
the government chooses to interplead in a given proceeding, and so on. We
shall focus on the confidentiality certificate, however, because of its
special status in proposed federal regulations by DHEW (42 CFR 2a) and LEAA
(28 CFR 22), and note particularly that the LEAA proposed regulations have
some advantages for investigators and participants not now included in the
proposed DHEW regulations.






Confidentiality Certificates. The LEAA proposed regulations refer to a
Privacy Certificate while those of DHEW refer to a Confidentiality
Certificate. The purpose of these certificates is "to protect the privacy
of individuals by requiring that information identifiable to a private
person obtained in a research or statistical program funded by LEAA may
only be used and/or revealed for the purpose for which it was obtained"
(28 CFR 22) and "The proposed amendment sets forth procedures under which
persons engaged in research on mental health, including research on the use
and effect of alcohol and other psychoactive drugs, may apply for an
authorization under section 303(a) of the Public Health Service Act
(42 U.S.C. 242a(a)), as amended by Pub. L. 93-282, to protect the privacy
of the research subjects by withholding from all persons not connected
with the conduct of the research the names and other identifying charac-
teristics of such research subjects" (42 CFR 2a). The certificate, in
both cases, is granted to the institutional sponsor for a proposed study
by a designated principal investigator(s). We make note of the fact that
the LEAA protection has somewhat less scope than that of DHEW, since LEAA
extends the protection to "information identifiable to a private person,"
where a private person includes "any individual, partnership, corporation,
association, public or private organization ... or combination thereof
... other than an agency or department of Federal, State, or local
government, or any component or combination thereof" (28 CFR 22.2(a),
(b)), making it inapplicable to government agencies, while DHEW includes
them: "Person means any individual, corporation, government, or govern-
mental subdivision or agency, business trust, partnership, association, or
other legal entity" (42 CFR 2a.2(b)).

We note in passing that statutory authority, of course, is essential 




to make the issuance of a confidentiality certificate possible. Such
statutory authority is now provided by the Congress for only a limited
number of federal agencies for their behavioral science research.

Degree of Protection Afforded. In describing the protection afforded,
the introduction to the DHEW regulations notes (Federal Register 40:234:
56693):

The proposed regulations provide that persons receiving
an authorization of confidentiality may not be compelled in
any Federal, State, or local civil, criminal, administrative,
legislative, or other proceeding to identify the research
subjects encompassed by the authorization (2a.7(a)), but
that such persons are not authorized to refuse to reveal
identifying information where (1) the subject (or, if
legally incompetent, his guardian) consents, in writing, to the
disclosure of identifying information, (2) the medical welfare
of the subject would be threatened by a failure to reveal such
information, or (3) release of such information is required
by the Federal Food, Drug, and Cosmetic Act or the regula-
tions promulgated thereunder (2a.7(b)). The purpose of these
exceptions is to prevent the protection against compulsory
disclosure of identifying information from being invoked to
the detriment of the research subject.

The regulations also set forth procedures on termination 
of Confidentiality Certificates. The protection afforded by 
a confidentiality certificate is, however, permanent with re- 
spect to subjects who participated in research during any time 
the authorization was in effect. 

In the proposed DHEW regulations research means ". . . any activity
that is intended and designed to establish, discover, develop, elucidate,
demonstrate, or confirm information or procedures. The term includes, but
is not limited to, behavioral science studies, surveys, evaluations, and
clinical investigations" (42 CFR 2a.1(c)). Clearly this is a sufficiently
broad definition to encompass what we have previously addressed as behavioral
science inquiry. Yet it must also be clear that the discretionary authority
to decide whether a particular inquiry qualifies is left to the Secretary
or other persons to whom that authority is legitimately delegated. LEAA may
have a somewhat broader definition, providing that "Research or Statistical




information — means any information which is collected during the conduct of
a research or statistical project or derived from such information, and
which is intended to be utilized for research or statistical purposes.
The term includes information which is collected directly from the individual
or obtained from any agency or individual having possession, knowledge, or
control thereof" (28 CFR 22). Despite this somewhat broader definition,
the published proposed LEAA regulations specifically excluded from re-
search ". . . information which is unrelated to project research and
statistical objectives" (28 CFR 22.23(4) & 22.27(4)). However, at recent
hearings on the proposed regulations there was apparent agreement to
eliminate this latter restriction, for reasons already discussed in our
section on the protection of confidentiality. DHEW is silent on the matter,
so that much would depend upon how the research clause is construed.

Both DHEW and LEAA provide explicit protection relating to "identifying
characteristics." The DHEW regulations may have a somewhat more limited
protection, since identifying characteristics ". . . refers to any data
collected on an individual by a researcher that contains his name, or
the identifying number, symbol, or other identifying particular assigned
to the individual which could reasonably distinguish that individual from
all others in the study, including but not limited to fingerprints, voice-
prints, or photographs" (42 CFR 2a.2(g)); though the definition of person,
as already noted, includes all individual and corporate actors. LEAA makes
explicit that "information identifiable to a private person — means informa-
tion which either (1) is labelled by name or other identification, or (2)
can, by virtue of sample size or other factors, be reasonably interpreted as
referring to a particular private person" (28 CFR 22.2(e)), though, as
noted, the definition of a private person specifically excludes government.
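The LEAA clause on sample size describes what would now be called a
uniqueness problem: a record that carries no name may still single out one
person if its combination of attributes occurs only once in the sample. The
following sketch is our own illustration of that point (the sample data and
field names are hypothetical and appear nowhere in the regulations):

```python
from collections import Counter

def unique_records(records, quasi_identifiers):
    """Return records whose combination of quasi-identifier values occurs
    only once in the sample, and so could "reasonably be interpreted as
    referring to a particular private person"."""
    keys = [tuple(r[q] for q in quasi_identifiers) for r in records]
    counts = Counter(keys)
    return [r for r, k in zip(records, keys) if counts[k] == 1]

# Hypothetical survey sample: no names, yet one respondent is unique.
sample = [
    {"county": "A", "occupation": "judge",   "age_band": "40-49"},
    {"county": "A", "occupation": "teacher", "age_band": "40-49"},
    {"county": "A", "occupation": "teacher", "age_band": "40-49"},
]
print(unique_records(sample, ["county", "occupation", "age_band"]))
# the lone judge in county A is identifiable despite the absence of a name
```

The same check, run before release of a data file, is one way an
investigator might judge whether "sample size or other factors" make a
nominally anonymous record identifiable.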




Regardless of which is more limited in what respects, the definition of
person in the proposed DHEW regulations and of identifying characteristics
in the LEAA proposed regulations may afford the maximum possible protection.

The question of who is to be afforded protection, and how one is to be
eligible for a certificate, likewise differs between the proposed regula-
tions. DHEW stipulates that any person engaged in the applicable research
described above ". . . who desires authorization to withhold the names and
other identifying characteristics of individuals who are the subject of such
research from any person or authority not connected with the conduct of such
research may apply to the Office of the Director of the National Institute
on Drug Abuse . . . , National Institute of Mental Health, or . . . National
Institute on Alcohol Abuse and Alcoholism . . . for an authorization of
confidentiality. Such an application may accompany the submission of an
application for grant or contract assistance" (42 CFR 2a.3(a)). The proposed
regulations thus apply only to some DHEW sponsored research. The LEAA pro-
posed regulations stipulate that "Each applicant for LEAA support either
directly or under a State plan shall submit, as a condition of approval of
any grant application or contract proposal, a Privacy Certificate"
(28 CFR 22.23(a)). Since a considerable range of kinds of research can be
sponsored by LEAA and none is excluded, there are no restrictions by
specific kind of research sponsored by the agency--though it might be argued
that all DHEW sponsored research would be comparable to all Department of
Justice sponsored research. In any case, so far as the issues confronting
the National Commission for the Protection of Human Subjects of Biomedical
and Behavioral Research are concerned, it should be apparent that only the
behavioral science research under the above sponsors could be protected by
a Confidentiality Certificate.






All such research should be protected where confidentiality and its protec- 
tion is at stake. 

Requirements for Certification. We shall not review in detail the
specific requirements that institutional sponsors and investigators must 
meet to be eligible for a Certificate of Confidentiality in both proposed 
DHEW and LEAA regulations. We shall simply make note of some, since these 
requirements are discussed in many sections of this paper, while reserving 
comments for others. The following are the major requirements (DHEW 42 
CFR 2a.4):

1. The Secretary may require any pertinent information other than 
that specified below; 

2. "The name and address of the individual primarily responsible for
the conduct of the research and the sponsor or institution with
which he is affiliated, if any;

3. "The location of the research project and a description of the 
facilities available for conducting the research, including the 

name and address of any hospital, institution, or clinical laboratory 
facility to be utilized in connection with the research"; 

We note that for much behavioral science inquiry, while the location of the
project can be specified in the application, the specific location of sites
where procedures will be performed is not generally available. The most
one may be able to specify is the kind of site, e.g., a stratified
probability sample of U.S. households.

4. "The names, addresses, and summaries of the scientific or other 
appropriate training and experience of all personnel having major 
responsibilities in the research project"; 

We would only make note of the fact that some of these persons may be
known only after funding and employment, so that provision should be made
to supply them at some later point, if that is deemed essential; otherwise
a statement of their qualifications when employed should suffice.

5. "(i) An outline of the research protocol for the project including, 
where applicable, the following information: (ii) A statement of 




the methodology to be followed including: (A) The number and types of
subjects (e.g., age, sex, education) who will be used in the research 
project; (B) The type of information which is to be collected and the 
instruments and methods for such collection; and (C) The procedures 
for safeguarding of data on the research subjects, which shall include 
as a minimum an assurance that records containing any information 
pertaining to a research subject shall be kept in a locked file 
cabinet, safe, or other similar container when not in use; and 
(iii) A statement: (A) From applicants who receive DHEW grant or
contract support for the research project with respect to which a 
Confidentiality Certificate is requested assuring that they will 
comply with all the requirements of 45 CFR Part 46, "Protection of 
Human Subjects," or 

(B) From all other applicants assuring that they will, if it is
determined by the Secretary, on the basis of information submitted
by the applicant, that (1) the subjects will be placed at risk and 
(2) a decision to allow the subjects to accept the risks is warranted, 
comply with the informed consent requirements of 45 CFR 46.3(c) and 
document legally effective informed consent in a manner consistent 
with the principles stated in 45 CFR 46.10. If a modification of 
paragraphs (a) or (b) of 45 CFR 46.10 is to be used, as permitted 
under paragraph (c) of this section, the applicant will describe the 
proposed modification and submit it for approval by the Secretary. 

(5) The estimated date for completion of the project; 

(6) A specific request for authority to withhold the names and 
other identifying characteristics of the research subjects and the 
reasons supporting such request; 

(7) An assurance that if an authorization of confidentiality is
given it will not be represented as a general endorsement of the 
research project by the Secretary or used to coerce individuals 
to participate in the research project; and 

(8) An assurance that the research subjects will be immediately 
advised of any termination of the authorization of confidentiality. 
(See 2a.8(c)).

We make specific note of only two provisions here that may raise some
questions.

Requirement 8 states that one must grant assurance that "an authoriza-
tion of confidentiality . . . will not be represented as a general endorse-
ment of the research project by the Secretary or used to coerce individuals
to participate in the research project." The requirement seems a reasonable
one only if certain matters are explicit. To effectively represent to
potential participants that one can afford the protection provided by the
confidentiality certificate, one must be able to make representations that
such protection is afforded by Federal regulations and on request furnish




proof that such a Certificate of Confidentiality has been issued. Indeed,
when matters of potential harm from disclosure of confidential information
are at stake, one may have an affirmative obligation to provide a copy of
the certificate to all potential participants, to fully inform them of the
nature of that protection, so that one meets the requirement of an "informed
consent." That in doing so one risks the possibility, even the likelihood,
that some participants will, on having that information, change their minds
and become participants should not be interpreted as "coercive" of
participation. A "reasonable and informed man" might well change his/her
mind when provided with a copy of the Confidentiality Certificate.

Since the LEAA Privacy Certificate is applied for in connection with a
regular research application, no special provisions of the foregoing are
stipulated. It is assumed that the obligation to provide these, in DHEW
sponsored research, including the requisite assurances, must be met when
they occur in conjunction with a regular application, an option that is
provided.
The Certificate of Confidentiality or Privacy Certificate and Its Limits

The proposed DHEW regulations provide some general guidelines for the
Secretary to take into account in issuing a Confidentiality Certificate
(42 CFR 2a.6), while such guidelines are only implied in the statement of
purpose for the Privacy Certificate in the LEAA proposed regulations
(28 CFR 22.1).

The discretion of the Secretary is constrained to take into account: 

(1) The soundness of the purposes and methods of the research 
project; 

(2) The scientific or other appropriate training and experience 

of all personnel having major responsibilities in the research project; 

(3) The suitability for use in the research project of the proposed 
subject population and the protections to be afforded to subjects; and 

(4) Such other factors as he may consider necessary and appropriate. 
All applications for confidentiality certificates shall be evaluated
by the Secretary through such officers and employees of the Department
and such experts or consultants engaged for this purpose as he
determines to be appropriate.

(b) After consideration and evaluation of an application for an 
authorization of confidentiality, the Secretary will either issue a 
Confidentiality Certificate or a letter denying a Confidentiality 
Certificate, which will set forth the reasons for such denial,
or will request additional information from the applicant.

The LEAA implied guidelines refer to matters of protecting privacy 
and clarifying the purposes for which identifiable information may be used 
or revealed. It likewise seems apparent that the requirements for informa- 
tion on application are addressed to the major criteria governing privacy 
certification. 

Elements in the Confidentiality Certificate. The proposed DHEW regula-
tions stipulate the elements in the Confidentiality Certificate and major
limitations on its protection and use (42 CFR 2a.6(b), (c), (d)).

The Confidentiality Certificate will include: 

(1) The applicant's name and address; 

(2) The name and address of the individual primarily responsible
for conducting the research, if such individual is not the applicant;

(3) The location of the research project; 

(4) A brief description of the research project; 

(5) The Drug Enforcement Administration registration number for 
the project, if any; and 

(6) The date of expiration of the Confidentiality Certificate. 

(c) A Confidentiality Certificate is not transferable and is 
effective only with respect to the names and other identifying 
characteristics of those individuals who are the subjects of the 
single research project specified in the Confidentiality Certificate. 
The recipient of a Confidentiality Certificate shall, within 15
days of any completion or discontinuance of the research project
which occurs prior to the expiration date set forth in the Certificate,
provide written notification to the Secretary. If the recipient
determines that the research project will not be completed by the
expiration date set forth in the Confidentiality Certificate he may
submit a written request for an extension of the expiration date
which shall include a justification for such extension and a revised
estimate of the date for completion of the project. Upon approval
of such a request, the Secretary will issue an amended Confidentiality
Certificate.

(d) The protection afforded by a Confidentiality Certificate does 
not extend to significant changes in the research project as it is 
described in the application for such Certificate (i.e., changes in 
the personnel having major responsibilities in the research project, 
or changes in the research protocol affecting the number and types of 




research subjects or the nature of their participation in the project). 
The recipient of a Confidentiality Certificate shall notify the Secre- 
tary of any proposal for such a significant change by submitting an 
amended application for a Confidentiality Certificate in the same form 
and manner as an original application. On the basis of such applica- 
tion and other pertinent information the Secretary will either: 

(1) Approve the amended application and issue an amended Con- 
fidentiality Certificate together with a Notice of Cancellation 
terminating the original Confidentiality Certificate in accordance 
with 2a.8; or

(2) Disapprove the amended application and notify the applicant 
in writing that adoption of the proposed significant changes will 
result in the issuance of a Notice of Cancellation terminating the 
original Confidentiality Certificate, in accordance with 2a.8.

We note especially the provisions stating that "The Confidentiality 
Certificate does not extend to significant changes in the research project 
as it is described in the application for such Certificate" and that "... 
the recipient of a Confidentiality Certificate shall notify the Secretary 
of any proposal for such a significant change by submitting an amended 
application for a Confidentiality Certificate in the same form and manner 
as an original application." This provision, to be sure, appears quite
reasonable on grounds of holding investigators accountable, so that they
do not extend the range of inquiry unduly to cover matters that invade the
privacy of participants and that might not otherwise be approved by sponsors
of the Confidentiality Certificate, as well as to constrain against
substantially altering the risks to participants. Yet, given the relative
lack of
guidelines (provided only by a few examples) as to what constitute "sig- 
nificant changes," it can lead both to improper regulation of scientific 
inquiry and to burdensome administration and decision-making. Many 
behavioral science studies undergo a large number of small changes as they 
proceed; it is more likely to occur with some designs than others. Such 
small changes might be regarded by others to cumulate into a "significant 
change." Experimental designs are less likely to involve such modifications 






than other designs. In general, the more systematic the design and proce-
dures used, the fewer the modifications called for. But the more exploratory
the inquiry, the less likely it is to utilize more systematic methods.
Exploratory investigations and the less experimental methods might be
burdened unnecessarily if no provision is made for approving modifications
within limits that construe "significant" in a broad rather than a narrow
sense. As it stands, the term "significant" is perhaps so ambiguous as
to pose a questionable standard for regulation.

The power of the Secretary, moreover, to disapprove such proposed
changes can pose problems of serious interference in scientific inquiry,
since there are virtually no guidelines in the proposed regulations to
constrain his discretion. Moreover, the notification that any such changes
will automatically entail the issuance of a Notice of Cancellation terminat- 
ing the original Confidentiality Certificate could be tantamount to cancelling 
any further inquiry deemed appropriate by an investigator and approved by an 
Institutional Review Board. While it may be necessary to utilize such 
sanctions to effectively control project alterations by investigators, it 
would appear that unless constraints are imposed on how judgment is to be 
made regarding "significant changes in the research project," any investigator 
is open to arbitrary control of the research design. 

Perhaps it would be more reasonable to leave changes in design to in-
vestigators and their institutional sponsors, setting guidelines that any
changes neither alter the basic objectives set forth in the original inquiry
nor increase the risks of harm that participants assume.
The research sponsor might then be expected to approve the changes and they 
would be covered by the Confidentiality Certificate unless the Secretary,
on being notified of these changes, sets forth specific reasons




why the proposed changes do not meet criteria for approval. This procedure 
would place the burden of proof upon the institutional sponsor and on the 
Secretary issuing the Confidentiality Certificate. Both should be obliged 
to set forth explicitly the reasons supporting any adverse decision before 
an amended application can be rejected for protection by a Confidentiality 
Certificate. When an Institutional Review Board rejects an investigator's
modifications that are to be covered by a Confidentiality Certificate, the
investigator should have a right to "appeal" that decision to the Secretary.
Both the Institutional Review Board's explicit statement of reasons for
rejection and the investigator's rejoinder should be forwarded in that case
to the Secretary for a final decision. To do otherwise is to raise the
spectre of unwarranted interference in scientific inquiry.

Protection Afforded with Waiver and Other Exceptions. The DHEW
proposed regulations set forth the following provisions regarding the effect
of a Confidentiality Certificate and exceptions to those effects (42 CFR
2a.7):

2a.7 Effect of Confidentiality Certificate: exceptions.

(a) Subject to the exceptions set forth in paragraph (b) of 
this section, a Confidentiality Certificate authorizes the with-
holding of the names and other identifying characteristics of
individuals who participate as subjects in the research project 
specified in the Certificate while the Certificate is in effect. 
The authorization applies to all persons who, in the performance 

of their duties in connection with the research project, have access 
to information which would identify the subjects of the research. 
Persons so authorized may not, at any time, be compelled in any 
Federal, State, or local civil, criminal, administrative, legislative, 
or other proceeding to identify the research subjects encompassed by 
the Certificate, except in those circumstances specified in paragraph 
(b) of this section. 

(b) Exceptions. A Confidentiality Certificate granted under this 
part does not authorize any person to refuse to reveal information 
which would identify a research subject where (1) the subject (or 

if he is legally incompetent, his guardian) consents, in writing, to 
the disclosure of such information, (2) the medical welfare of the 
research subject would be threatened by a failure to reveal such 
information, or (3) release of such information is required by the 
Federal Food, Drug, and Cosmetic Act (21 U.S.C. 301) or the regula- 




tions promulgated thereunder (Title 21, Code of Federal Regulations). 
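The logic of paragraph (b) can be restated compactly: disclosure of
identifying information may be compelled only when at least one of the three
enumerated exceptions holds. A minimal sketch of that rule (our own
restatement, for illustration only; the text of the regulation itself of
course governs):

```python
def may_compel_disclosure(written_consent: bool,
                          medical_welfare_threatened: bool,
                          required_by_fdca: bool) -> bool:
    """Restatement of 42 CFR 2a.7(b): a Confidentiality Certificate does
    not authorize refusal to reveal identifying information when any one
    of the three enumerated exceptions applies."""
    return written_consent or medical_welfare_threatened or required_by_fdca

# Absent all three exceptions, the certificate's protection holds.
print(may_compel_disclosure(False, False, False))  # prints: False
```

The disjunctive structure makes plain why the exceptions are said to operate
in the subject's favor: each can be triggered only by the subject's own
consent, the subject's medical welfare, or an independent statutory command.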

The proposed DHEW regulations basically provide protection against
compulsory disclosure of identifying information. As noted earlier, this
provision offers considerable protection to both participants and inves-
tigators. The limitation of the protection to identifying characteristics,
which includes the data so identified, provides sufficient protection in
a large proportion of behavioral science studies. Yet, as noted earlier,
the power to compel disclosure of all other information, including the
materials of investigators, could expose certain individual and corporate
actors to harm, simply because at times it is difficult to determine what
is an identifying characteristic that might bring disclosure and what
information others have that would make identification possible. It is
well to bear in mind that the behavioral scientist is not the only one who
may possess information with identifying characteristics; others may also
have possession of some information. Where there is overlap in the two
sets of information, each can become privy to the information of the other--
a technique of expanding the amount of intelligence not unknown to intel-
ligence and law enforcement agencies. Thus the simple removal of identifying
characteristics does not guarantee that, given some overlap in information
held by others, they cannot become privy to the confidential information the
investigator seeks to protect. For that reason alone, one should be obligated
to protect all the information at an individual level if there is any risk
of harm on disclosure and, correlatively, that protection should be afforded
against compulsory disclosure of all information. The appropriate standard,
then, is that set forth in our discussion of shield laws: to protect all
information that is gathered by research activity. Protection of all
information related to a criterion of research activity provides greater




protection than does one based on a criterion of identifiable information. 
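The overlap problem described above can be made concrete with a small
sketch (entirely our own illustration; the records and field names are
invented): even after names are removed from research records, attributes
shared with a file held by another party suffice to re-attach identities.

```python
# De-identified research records: names removed, but some attributes kept.
research_data = [
    {"zip": "20001", "birth_year": 1941, "diagnosis": "condition X"},
    {"zip": "20002", "birth_year": 1950, "diagnosis": "condition Y"},
]

# A second file, held by another party, overlapping on zip and birth year.
outside_file = [
    {"name": "R. Roe", "zip": "20001", "birth_year": 1941},
    {"name": "J. Doe", "zip": "20003", "birth_year": 1950},
]

def link(research, outside, overlap=("zip", "birth_year")):
    """Join the two files on their overlapping attributes; any match
    re-attaches a name to a supposedly de-identified record."""
    linked = []
    for r in research:
        for o in outside:
            if all(r[f] == o[f] for f in overlap):
                linked.append({**o, **r})  # name plus research data
    return linked

print(link(research_data, outside_file))
# R. Roe is now tied to "condition X" although no name was ever collected
```

This is why the text argues that removing identifying characteristics alone
is insufficient, and that protection should extend to all information
gathered by research activity.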

The LEAA proposed regulations provide (28 CFR 22.28): "(a) Research
or statistical information identifiable to an individual and/or copies there-
of shall be immune from legal process and shall only be admitted as evidence
or used for any purpose in any action, suit, or other judicial or adminis-
trative proceeding with consent of the individual providing such information,
or, in any case in which information is obtained through means other than
direct inquiry of the individual to whom the data pertains." Quite similar
to the DHEW provision, it omits reference to legislative proceedings and,
while providing, as noted earlier, a somewhat broader definition of what is
meant by "information identifiable to a private person," it still does not
provide protection for other information collected in the course of research
activity. Both regulations, as now proposed, offer no protection for
information that is collected by inadvertence or as a consequence of natural
occurrence in social situations to which investigators become privy, an
omission we noted earlier that should be corrected in the interest of be-
havioral science inquiry.

The DHEW regulations make no provision for protection of the information
against unauthorized or illegal use, and sanctions are not
provided for in the case of misuse. Tort remedies are unlikely to be useful
in aiding investigators to protect information from employee misuse or
unauthorized access; special statutory and regulatory sanctions are required
to provide investigators such effective control. Protection of this kind is
provided for in the proposed LEAA regulations (28 CFR 22.29), where LEAA
is authorized to take legal actions leading to imposition of a fine not
to exceed $10,000 against any person who violates the provisions of confidentiality.
The Commentary on the proposed regulations makes clear:

This would include the grantee organization, as well as particular
individuals (including grantee employees) committing violations.
(Federal Register 40, 186:44037)

The Commentary also makes clear that violations under transfer agreements 

are similarly covered by these sanctions. 

The exceptions to the privilege accorded by the DHEW Confidentiality
Certificate include both an individual's right to waiver and exceptions that
seem applicable only to bio-medical research sponsored by the DHEW agencies
covered by the proposed regulations. LEAA provides for the same waiver of
privilege. The reader is referred to our earlier discussion of waiver of
privilege for a consideration of qualifications on the participant's power
to waive.

Apart from waiver of privilege, the question can be raised as to whether
there are specific exceptions that should be provided for in any Confidentiality
or Privacy Certificate. We previously discussed the obligation to
disclose information on future crimes, at least those of a heinous nature.
There are other matters that merit consideration as well:

1. Investigators should be permitted to transfer information identifiable
to private persons to other persons or organizations for research
or statistical purposes, provided they are covered by and legally bound by
the same provisions governing confidentiality and the disclosure of
information.

2. The Federal Government has a right and a duty to audit sponsored 
research. This probably means they must have access to information regarded 
as confidential to insure that at least research subjects were indeed sub- 
jects and were at least dealt with by certain procedures. The LEAA regula- 
tions provide for the sanction of government employees (28 CFR 22.29) if in 
any way they violate the provisions of section 524a of Pub. L. 93-83 Stat. 197,
the statutory authority for the regulations.

3. Provision is made in the LEAA regulations for staff access to
confidential data (28 CFR 22.21(b)), and staff are similarly subject to
federal employee sanctions as well as the specific sanction provisions of
the regulations. Absent some guidelines governing when staff shall have
access to confidential information, there is a risk that such monitoring
might be used for other than the legitimate purposes of protection and audit.
If the sponsor's interest lies in utilizing the information for research or
statistical objectives, that should be made a matter of contractual agreement
on the grant or contract award rather than a blanket authority
granted all staff in the regulations. If this form of protection from
staff is not provided, other modes should be considered, e.g., that such
information can be obtained only with the specific authorization of the
Director and then only with a statement of the reasons why the information
is requested. This latter provision should be a minimum requirement for any
staff access to the confidential information in a research project.

Institutional Control. The procedures for approving research and
applying for a Confidentiality Certificate in the proposed DHEW regulations
fail to make clear what role the Institutional Review Board or institutional
sponsor has with respect to approving or disapproving the request for a
Confidentiality Certificate. The LEAA provisions provide only for approval
of the research by the institutional sponsor. The question of whether or
not an investigator should apply for a Certificate of Confidentiality, however,
is germane to the considerations of the Board. Yet a Board should never
withhold approval from a research project merely because a Certificate of
Confidentiality is requested, though it may do so when it regards a Certificate
of Confidentiality as essential to protect subjects "at risk." The reasoning
behind this proposed guideline for Institutional Review Boards is that
investigators should be permitted to request protection whenever they regard
it as essential to their own as well as to participant protection. At the
same time, both investigators and Institutional Review Boards have a
responsibility to protect the participants at risk, and the IRB should have
the power to require that a Certificate be requested if in its judgment it is
essential for protection from harm.




V. SOME THOUGHTS ON THE REGULATION OF BEHAVIORAL SCIENCE INQUIRY 

The Role of Government Sponsor.

To a growing degree, government has become the sponsor of biomedical 
and behavioral science inquiry. Support from both the private sector and 
from voluntary associations or foundations comprises an ever smaller part 
of the investment in research undertaken by employees of non-profit organi- 
zations. The obligations of government in regulating research and its re- 
sponsibilities for harm done are far from clearly defined. 

Current DHEW models of regulating biomedical and behavioral science
inquiry on their face place the federal government in several protection
roles: (1) that of protecting the government's general interest in the
public's right to information and its particular interest in deriving
specific benefits for its many functions (legislative, executive, and judicial)
by setting program standards and objectives for research to qualify
for funding; (2) that of protecting the rights of investigators from too
much government interference by providing for institutional and peer review
and making public the grounds on which applications are denied by
the government agency; (3) that of protecting the rights of participants
in research by establishing regulations requiring investigators to secure
informed consent, protect subjects, etc., as a condition of their sponsorship.
There are other ways that government research sponsors assume the
legitimate mantle of protector, but the right to protect carries with it
more than a responsibility to see that protection is adequate and in the
public interest.

Some of that "something more" is the responsibility government might
assume in its role as specific sponsor. The use of experiments, the growth
of evaluation research, and the creation of many other interventions
combining research and social action objectives originate at least as often
with the government sector as with the "voluntary" community of
scientists. Many research proposals and some procedures are shaped to a
substantial degree by government needs, government inducements, and government
requirements. The government wants "cures" for physical, psychological and
social ills, and it shapes its programs and funding to do research on them —
a proper role, to be sure. Yet the more a government by its policies and
programs provides inducements that shape what investigators do, the more it
must pay attention to its responsibility for the consequences of research.

The more a government induces research that requires experiment,
evaluation and action research, or other types of research that include
interventions in social life beyond the interventions required by research
procedures per se, the more likely it is to do harm as well as good. This
is so if for no other reason than that, even with a low probability of harm,
the more of that kind of research there is, the more harm is done by research.
But since risks from some kinds of research are greater than others, the
more the government induces investigators into high participant-risk
research, the greater the burden it should also assume for failures and liabilities.
Such burdens should not fall exclusively on investigators and their
institutional sponsors. If the government wants a cure for drug use and
encourages research on drug use in human subjects, it has not only a strong
obligation to protect those subjects, but it should also incur some of the
liabilities that may result from any harm done. Moreover, it should not readily
do harm by disclosing confidential information or by other means without an
overriding public interest. Matters of tort liability not altogether aside in
American law, government must increasingly recognize its responsibilities
for the actions of its agents, direct or indirect, as well as a need to
protect participant and investigator interests. This may mean that all
parties to government-sponsored research, including the government as sponsor,
must come to recognize that there may be affirmative responsibilities as
well as liabilities when harm is done: a responsibility to help that may
override liabilities that might otherwise obtain, and a responsibility to share
in the costs from tort actions or other forms of settlement. It may not be
enough to encourage protection by "informed consent," leaving the risks to
fall to those who consent and the liabilities to those who are the immediate
principals in securing it. To know risks and to consent is no guarantee
that harm will not be done. Knowledge affects choice, but knowledge cannot
insure that whatever risk is taken will not fall upon the chooser, for it
must fall upon someone. How is such harm to be dealt with? Only by tort
actions, or by the burden being borne by those who "freely chose" to consent?
Perhaps that is not enough in a society where government encourages and
accepts legal affirmative duties in matters of harm.

Individual and Corporate Actor Informed Consent.

Attention has been called to the strong likelihood that when research
is undertaken that involves corporate actors, information often must be
obtained not only on corporate behavior per se, but from those who are
members of the corporate actor (corporate actors include all forms of
collectivities, from families and other small groups to bureaucratic
organizations). This necessity raises a problem: when the consent both of the
corporate actor (provided by some "officially" recognized or legally authorized
person) and of all members who must be involved in the research is required, each
has the power to control the actions of others and subvert the research
goals by refusal to participate. There are two principal types of dilemmas
in this respect. The first occurs when the corporate actor grants consent
but its employees have a right to refuse consent to participate in their
role as employees. This might occur, for example, when the federal government
requires that a prison security program be evaluated but the guards
refuse to grant their consent. We have suggested that this might be
resolved by formal contract in making a grant that involves the corporate
actor; the corporate actor then exercises an "employer right." Yet there
might well be questions of limits to employer rights where research is
involved, and those matters must be explored.

A second dilemma occurs when employees grant their consent but the
corporate actor refuses to do so for matters that involve the corporate
actor. We noted that this could arise when, for example, teachers might
grant their consent to an investigation of styles of school administration
and their effects on learning, but the school administration would refuse to
grant its consent. What is at issue here is the right of employees to
disclose matters that involve the corporate actor. There likewise is no
simple answer to that question.

Both of the above dilemmas can occur for inquiry in private as well
as public organizations. Consider, for example, the interest that
LEAA might have in studying policing and security by both private and public
organizations (it provides funding for both types of research). Studying
policing in either the private or the public sector raises these dilemmas
of requiring informed consent by all or only some parties of the corporate
actor.



Without some reasonable balancing of rights to informed consent and 
rights to give and obtain certain kinds of information, much behavioral 
science inquiry on corporate actors becomes impossible. Need one add that 
in a modern complex society information on corporate actors may be of greater 
consequence than that sought on individuals apart from their roles for 
corporate actors? 

Journalists, Behavioral Scientists and the First Amendment.

Both journalists and behavioral scientists seek to lay claim to First 
Amendment rights to protect their right to inquiry and to disclose informa- 
tion in the public interest. Both face serious difficulties in protecting 
their confidential sources of information by laying claim to a privilege 
from compelled disclosure of confidential information based on the First 
Amendment, though there is little argument that Congress "presumably has 
the power to fashion legislation such as a testimonial researcher's privilege 
to insure that the First Amendment rights of researchers are not infringed. 
Such legislation would apply both on the federal and state levels" 
(Nejelski and Peyser, 1975:B-28). 

There are, however, substantial differences between them in their
objectives, modes of acquiring information, and modes of disseminating
information. These must be kept in mind lest one assume their needs and
requirements for protection are similar.

First, behavioral scientists always seek to protect any individual from
any effect of public information, while the journalist often seeks to do
exactly the opposite. Generally, behavioral scientists seek to characterize
aggregates, not individuals, and while journalists at times have a similar
objective, often they do not. While both want to protect their sources of
information, their objective in doing so stems from quite different grounds.
Journalists, not unlike some law enforcement agents, seek to protect the
source of the information that is disclosed about some other individual or
corporate actor. Their object may well be to do harm by disclosure, a harm
that presumably benefits the public interest in information. Those grounds
are almost always absent in behavioral science research, though there are
exceptions, since behavioral scientists seek to protect all individual-level
data, both its source and any persons to whom the information applies.

Second, journalists are not now regulated by a requirement of informed
consent, much as is the case for behavioral scientists who do research in any
public or private organization that lies outside the domain of government-sponsored
research. This leads to an imbalance in what information can be
provided by whom, how, when and where. Were regulation to become unduly
constraining for government-sponsored research, that kind of inquiry might
well shift to the private sector of behavioral science research and to the
domain of journalists. Both shifts may have undesirable consequences. Were
it to shift primarily to journalists, one would pay the cost that public
information on many aspects of social life would fall to methods and procedures
that lack the constraints of science. Were it to shift to the private
sector, it might in the long run jeopardize at least the study of government.
One can think of other unintended and dysfunctional consequences as well.

Third, behavioral scientists have more of a stake in sharing confiden- 
tial information for purposes of research than do journalists. Journalists 
do not ordinarily wish to share identifiable data; their sharing is done in 
the public press. Behavioral scientists thus have special problems of the 
transfer of identifiable data and its protection. 




Need for Research on Informed Consent, Protection of Confidential Information,
and their Regulation.

It is axiomatic that intelligent and enlightened regulation of biomedical
and behavioral science research depends upon careful research on the matters that
are to be regulated. We cannot, for example, reasonably choose among modes
of regulation without knowing a great deal about their consequences for free
scientific inquiry — knowledge that must come, in part, from research. Yet
it is paradoxical that once legal regulation is introduced, it necessarily
makes choices that constrain what investigators can do. Every form of
emancipation carries with it its own form of enslavement. The consequences of
regulation of behavioral science inquiry can be particularly destructive
when they preclude or unduly constrain inquiry into processes of regulation
and their effects on scientific inquiry. We shall briefly illustrate by
several examples how this might easily be the case.

1. Were regulation to prohibit some forms of what is called "deception"
in behavioral science inquiry, it would also preclude studying whether
deception has the effects it is presumed to have and why, therefore, it was
constrained. We very much need more knowledge of the effects of withholding
certain kinds of information in securing consent from participants, and of
ways that such effects, if they may harm, can be altered to reduce substantially
the risk of harm. There are no other animals on which many features
of social life can first be investigated.

2. Were regulation to preclude research on regulatory processes for
behavioral research in any way, it would deny us the knowledge we may
need for intelligent regulation. A requirement that regulators grant their
informed consent to be studied could well do just that. I note in passing
that institutional sponsors and principal investigators perhaps have the
same rights to informed consent as all other participants in research. Should
they be permitted to preclude many kinds of research on self-regulation
because their informed consent is required?

3. It is axiomatic that any system of regulation or control generates 
its own forms of deviance. It must be so also with the regulation of 
scientific inquiry. The proper study of those forms of deviance among 
scientists will require their protection as participants if valid and reli- 
able information is to be obtained on the "knowledge establishment". 
Knowledge of patterned evasion and other forms of deviation from the rules 
of regulation must be acquired for enlightened regulation. 

I note in passing that much remains to be known about the organization 
of the production and dissemination of knowledge, about the role of govern- 
ment in research, including its regulatory processes, and about the effects 
of scientific inquiry on the participants in research. Acquiring that 
knowledge should not be constrained by regulation so as to subvert the very 
goals of enlightened regulation. 




VI. EPILOGUE 

A long dissertation, and this one perhaps needlessly so, has an end
as well as a beginning. The medium is the message; yet redundancy is not
altogether lacking in value. What is it that we have tried to say? Is
there a cautionary tale?

This paper has treated of matters of regulating behavioral science
inquiry by a requirement of informed consent. We have emphasized that
informed consent is inextricably bound in behavioral science inquiry with
the risks that attend disclosure of confidential information, and we have
stated the case for the maximum possible legal protection for confidentiality.
We likewise have emphasized that an elementary Human Subjects
model of scientific inquiry is often inapplicable when applied to behavioral
science inquiry; it perhaps often is so as well for bio-medical inquiry,
depending upon how the line is drawn among disciplines.

Along the way, we have tried to show that some elements are more
or less distinctive of behavioral science inquiry and how exceptions to
these must be treated separately:

1. Behavioral scientists are interested in aggregate data on actors,
whether individual or corporate, not in individual-level data. Exceptions
arise for evaluation or assessment research, and their requirements may be
different.

2. Behavioral scientists generally intervene in the life of partici- 
pants only to acquire information from and about them; it is much less 
common that some form of intervention other than the research procedure 

is undertaken. Where it does occur, such as in experiments with human 
subjects and their collective life, separate consideration should be given 
to the problems that arise when a research role intersects with an
intervention role and to the consequences for research of deliberate
intervention for purposes other than research.

3. Behavioral science inquiry is generally low-risk inquiry, so that
for much of it a requirement of informed consent seems unnecessary and
burdensome. The main risk of harm in behavioral science inquiry arises
from the disclosure of confidential information, the disclosure
being the source of harm. There is an obligation to protect participants
from that risk of harm by disclosure, one that can be met by a legal
privilege against compelled disclosure and by legal penalties for
unauthorized disclosure, misuse, or illegal use.

Finally, we note that legal regulation carries with
it its own consequences that must be investigated by behavioral science
inquiry if regulation is to be both enlightened and in keeping with constitutional
imperatives. Regulation should impose as few constraints as
possible on the study of the effects of regulation on free scientific
inquiry, if in no other way than by making special provision for that kind
of inquiry as an exception (and with due care for the protection of all interests).




FOOTNOTES 

1. There are some statutory limitations on consent where proprietary 
interests prevail or when exchanges are privileged. 

2. The more unplanned the intrusion into private matters, the more 
complicated are problems of "informed consent" and "protection of 
the sources of information," matters treated below. 

3. Note that I do not argue that we have a more legitimate claim to 
"truth," whether or not it is made in the name of scientific inquiry, 
but simply that our claim to science opens us to political challenge. 

4. The concept "written consent" applies to more than the requirement
that it be written (one has an option to read it). The operable condition is
that it be a signed consent to a written statement that is read either by
participants and/or their representative(s) or by the investigator/agent.
In this sense "signed written consent" is a more meaningful
designation of these procedures for obtaining informed consent.




REFERENCES 



American Law Institute 

1967 Restatement of torts: second tentative draft. 652. 

Benson, J. Kenneth and J. O. Smith

1967 "The Harvard drug controversy: a case study of subject
manipulation and social structure." Gideon Sjoberg (ed.),
Ethics, Politics and Social Research. Cambridge: Schenkman
Publishing Co., 115-40.

Crawford, Elisabeth T. and Albert D. Biderman (eds.) 

1969 Social Scientists and International Affairs. New York: 

John Wiley & Sons. 

Goffman, Erving

1961 Asylums. New York: Doubleday & Co. 

Goldstein, Abraham S. 

1969 "Legal control of the dossier" in Stanton F. Wheeler (ed.) 
On Record: Files and Dossiers in American Life. New York: 
Russell Sage Foundation. 

Horowitz, I. L. and Lee Rainwater 

1970 "Journalistic moralizers." Trans-action 7 (May): 5-8.

Marx, Gary 

1973 Muckraking Sociology. New Brunswick: Transaction Books. 

Nejelski, Paul and L. M. Lerman 

1971 "A researcher-subject testimonial privilege: what to do
before the subpoena arrives." Wisconsin Law Review: 1085-
1148.

Nejelski, Paul and Howard Peyser 

1975 "A researcher's shield statute: guarding against the 

compulsory disclosure of research data" in Committee on Agency 
Evaluation Research, Protecting Individual Privacy in Evalu- 
ation Research. Washington, D.C.: National Academy of 
Sciences. 

Pound, Roscoe 

1915 "Interests of personality." Harvard Law Review 28 (February): 

343-65. 

Ruebhauser, Oscar M. and Orville G. Brim, Jr. 

1965 "Privacy and behavioral research." Columbia Law Review 65
(November): 1184-1211.




Shils, Edward A.

1956 The Torment of Secrecy. Glencoe: The Free Press.

1959 "Social inquiry and autonomy of the individual." Pp.
114-57 of Daniel Lerner (ed.), The Human Meaning of the
Social Sciences. Cleveland: The World Book Publishing Co.



Sjoberg, Gideon

1967 "Project Camelot: selected reactions and personal reflections."
Gideon Sjoberg (ed.), Ethics, Politics, and
Social Research. Cambridge: Schenkman Publishing Co.,
141-61.



U.S. Dept. of Health, Education & Welfare

1975 "Protection of human subjects: technical amendments"
[45 CFR, Part 46], Federal Register 40 (Thursday, March 13),
No. 50:11854-11858.

U.S. Dept. of Justice, Law Enforcement Assistance Administration

1975 "Confidentiality of identifiable research and statistical
information: proposed regulations governing grant program
information" [28 CFR, Part 22], Federal Register 40
(Wednesday, Sept. 24), No. 186:44034-37.

U.S. Public Health Service

1975 "Protection of identity - research subjects and patients"
[42 CFR, Part 2a], Federal Register 40 (Thursday, December
4), No. 234:56692-56695.



Warren, Samuel D. and Louis D. Brandeis

1890 "The right to privacy." Harvard Law Review 4 (December):
193-220.




26 

THREE THEORIES OF INFORMED CONSENT: PHILOSOPHICAL 
FOUNDATIONS AND POLICY IMPLICATIONS 

Robert Veatch, Ph.D. 
February 2, 1976 



Abstract 



To understand the nature and definition of informed consent it 
is essential to understand the reason why we get consent in the first 
place. This paper outlines three alternative theories of informed con- 
sent. First, consistent with the traditional Hippocratic ethic of the 
medical profession, informed consent may serve the purpose of protecting 
subjects from harm. That current DHEW regulations require assuring
informed consent only when subjects are at risk implies that this may
be the foundation. However, if the objective is to protect subjects
from harm, this could be accomplished more efficiently by simply banning
all non-therapeutic research. Furthermore, one must understand why one
would be committed to protecting individuals from harm. It is suggested
that it is because individuals are the possessors of individual rights,
including the right to self-determination.

A second theoretical foundation for informed consent might be the 
classical utilitarian one: the greatest good for the greatest number. 
If the research enterprise depends on continued trust and confidence 
from the public, then consent might, in the long run, produce the great- 
est good by helping maintain the public trust in the medical research 
community. The difficulty with the second theory, however, is that it 
justifies too much. Often it might be the case that even greater good 
would be done if no consent were obtained and the rights of the individual 
were subordinated to the good of society. Once again a commitment to 
the rights of the individual requires that limits be placed on arguments 
based solely on consequences. 

The third theoretical foundation for informed consent we believe to 
be the most plausible one: the individual's right to self-determination. 
This right, basic to Western society and American political philosophy in 
particular, implies that invasion of the individual's body or privacy re- 
quires an informed consent. The consent cannot be dependent upon the 
claim that good consequences can come for the individual or society if 
the consent is obtained. 

Next, the implications of the self-determination theory of consent
for the standard of consent (for determining how much information must
be transmitted for consent to be adequately informed) are examined. It
is suggested that while professional standards might (but not necessarily
would) be acceptable if consequences were the foundation of consent,
the principle of self-determination requires the reasonable person standard
now being incorporated into informed consent court cases in many
jurisdictions. This standard must be modified, however, when there is
evidence that the subject wants more information than the reasonable person
would. The practical implication is that, for purposes of approving the
adequacy of the consent (and for judging whether the risks to the subject
are justified by the potential benefits to the subject and/or others),
an all-lay committee of "reasonable people" is the only reliable basis
of judgment. An advocacy system for introducing technical information
to such a lay committee is proposed.


The implications of the self-determination theory of informed 
consent for Group I (competent, noninstitutionalized adults receiving 
medical care through private sources) are traced. It is suggested 
that only research protocols which would themselves compromise the 
subject's future capacity to consent should be prohibited by a re- 
view committee. For Group II subjects (those whose capacity or 
opportunity to consent is more problematic), however, self-determin- 
ation may be impossible (small children, the comatose), compromised 
(older children, the mentally incompetent), or de facto restrained 
(prisoners, clinic patients, and subjects of experiments where con- 
sent would destroy the research). Group II subjects should normally 
be used only when it is impossible to use Group I subjects. 

The question of overriding the principle of self-determination
in cases where consent would destroy the experiment is considered,
with the conclusion that only the principle of self-determination itself
provides a workable ground for waiving consent. The only other possible
ethical grounding, a non-utilitarian theory of justice, may eventually
provide an additional basis, but only when the application of the theory
to medical experimentation is further developed. A national-level review
is proposed for any use of Group II subjects.



Specific Recommendations 



1. The individual's right to self-determination should be recognized as
the foundation of the requirement for informed consent.

2. The present DHEW policy of requiring legally effective informed consent
only if risk is involved should be abandoned.

3. The "reasonable person" standard for judging the adequacy of consent
should be formally recognized, except in cases where there is evidence
that the individual subject would require a different level of information
in order to exercise what he or she considers self-determination.

4. An advocacy system of IRB consideration of protocols including adequacy 
of proposed consent forms should be adopted. 

5. The following additional items should be included in the current list 
of "basic elements" of an informed consent: 

a. A specific disclosure of the presence of a control 
group within the research design. 

b. A statement of the inconveniences as well as the 
risks and discomforts. 

c. Names of review and patient protection agents at the 
local and national level. 




d. A statement of the basic rights of the subject. 

e. An explanation of who, if anyone, will be responsible 
for harms done. 

f. An explanation of the right, if any, to continue 
receiving treatments found helpful. 

6. The words "for negligence" should be deleted from the exculpatory lan- 
guage prohibition. 

7. The "short form" of written informed consent in which a subject signs 
a statement that the information has been transmitted orally should be 
abandoned. 

8. Researchers or their staffs should never be expected or permitted 
to obtain the consent themselves.  A specially trained individual not 
directly involved in the research should have that task. 

9. Experiments on Group I subjects with free and informed consent should 
not be disapproved unless they would compromise the subject's future 
ability to exercise self-determination. 

10. The term "proxy consent" should be abandoned. Parent or guardian 
"selection" or "approval" should be required for therapeutic research 
on children. Parents should have discretion within the limits of 
reasonableness to decide what should be counted as potentially thera- 
peutic for their wards. 

11. Parents or guardians should be permitted to approve non-therapeutic 
research on their wards whenever the research meets rigid criteria 
including no or minimal risk to the subject. 

12. Children and formerly competent patients should be able to exercise 
self-determination in rejecting non-therapeutic experiments. 

13. The formerly competent patient's wishes clearly expressed while 
competent should be determinative when the patient is no longer 
competent. 

14. Children and formerly competent patients should be able to exercise 
self-determination in accepting or rejecting therapeutic experiments 
and accepting non-therapeutic experiments if they are judged by a 
court to understand sufficiently the nature of the choice. 

15. Prisoners should not be treated as in any way having lost their 
capacity for self-determination. 

16. In cases where the de facto opportunity for prisoners to exercise 
self-determination is diminished because of the nature of the in- 
stitutional structure, this should be seen as a fault of the prison 
system, not of the prisoner. 




17. A scheme should be considered whereby prisoners are compensated 
for research at rates comparable to other prison wages, proportionate 
to time and risk, while those doing research pay at a rate comparable 
to the cost of obtaining similar subjects outside the prison.  The dif- 
ference should be made available to the prison population for edu- 
cational and recreational activities. 

18. Clinic patients should be treated as Group II subjects. It should 
be required that at least half of all subjects be recruited from 
other than clinic patient sources. 

19. In experiments where informed consent would destroy the research, 
informed consent should nevertheless be required unless it can 
reasonably be presumed, with at least a 95 percent level of certainty 
on the basis of specific empirical evidence obtained from mock 
subjects drawn from the same subject population, that the real sub- 
jects would not consider their uninformed participation a violation 
of their right to self-determination. 

20. More research should be undertaken on the adequacy of a non-utilitarian 
theory of justice for providing a criterion for overriding consent in 
specific cases where those less well off than the subject would bene- 
fit greatly. 

21. A special, second review at the national level of the quality of the 
consent (and the acceptability of the risk) should be required for 
all use of Group II subjects to assure that self-determination is 
preserved to the extent possible and that only reasonable risks are 
taken when self-determination is not possible. 




Current government regulations require local review of all biomedical 
and behavioral research on human subjects supported under grants and con- 
tracts from the Department of Health, Education, and Welfare to determine 
whether subjects will be placed at risk and, "if risk is involved," whether 
"legally effective informed consent will be obtained by adequate and appro- 
priate methods."1  The logical implication is that informed consent of human 
subjects, insofar as it is mandated by DHEW regulations, is subordinated to 
and derivative from the goal of protecting human subjects from risk. If 
that is the case, I believe the current requirement of informed consent 
rests on an inadequate base. 

My objective is to analyze the philosophical foundations of informed 
consent, articulating three theories of informed consent and the implications 
of those theories for public policy. I shall argue that informed consent 
in its essence cannot be related to and derived from the notion of avoiding 
risks and/or producing good consequences, but must have an independent 
philosophical foundation. That foundation, so I shall argue, is the prin- 
ciple of autonomy—of self-determination. After exploring the three com- 
peting theories of informed consent, I shall then examine the implications 
for deciding how much information ought to be transmitted for consent to 
be informed. Finally I shall trace some of the policy implications first 
for informed consent from competent, non-institutionalized subjects and 
then for subjects who are legally incompetent, institutionalized, or 
both. 




I. THREE THEORIES OF INFORMED CONSENT 

My assigned task is to discuss the nature and definition of informed 
consent. Although I am to focus on informed consent in various research 
settings, I am convinced that the same standards apply to clinical medicine. 
Thus some reference to cases and arguments dealing with routine clinical 
care will be made.  Also it might be appropriate to broaden that task 
slightly to discuss free and informed consent. That consent be both free 
and informed within certain limits seems necessary to make a consent ade- 
quate. It is important to realize how modern any notion of consent is 
whether or not it is qualified by the requirements that it be free and 
informed. In order to develop a theory of the foundations of consent it 
is essential to place the concept in a historical context. 

A. The Patient Benefit Theory of Informed Consent 

Traditionally experimentation in medicine was an integral part of the 
treatment of the patient. The Hippocratic authors placed medicine on a more 
naturalistic footing. In works such as The Sacred Disease the Hippocratic 
corpus demystifies diseases such as epilepsy. The author argues with re- 
gard to epilepsy, which had at the time been interpreted as being caused 
by sacred powers, that "It is not, in my opinion, any more divine or more 
sacred than other diseases, but has a natural cause, and its supposed divine 
origin is due to men's inexperience, and their wonder at its peculiar char- 
acter." 

In spite of the fact that Hippocratic and Galenic medicine viewed 
medical problems as natural phenomena, these traditions did not rationalize 




and systematize medical experimentation as we know it. This did not 
happen until modern times--the end of the eighteenth century. The 
Hippocratic physician would try out new remedies, but always in the con- 
text of treating a patient when routine therapies were not successful. 
It was not until well into the modern period that medical experimentation 
was undertaken in the sense of systematically designed research for the 
purpose of gaining medical knowledge. It is in part for this reason that 
consent is absent from the Hippocratic tradition. 

The ethic of the Hippocratic physician was (and to some extent still 
is) rooted in a special set of norms. According to Ludwig Edelstein the 
deontological (ethical) writings of the Hippocratic corpus reflect the 
philosophical, religious, and scientific views of the Pythagorean cult. 
The dominating ethical norm is that the physician's duty is to do what will 
benefit the patient according to his ability and judgment. 

Although the modern physician may not have read the Hippocratic Oath 
recently, the ethical norms are ones with which he is comfortable. The 
World Medical Association in 1949 adopted an International Code of Medical 
Ethics which includes an updated version of the patient-benefitting principle: 
"Under no circumstances is a doctor permitted to do anything that would weaken 
the physical or mental resistance of a human being except from strictly ther- 
apeutic or prophylactic indications imposed in the interest of his patient." 
It is generally thought that the Hippocratic Oath may be rather platitudinous. 
It is usually not recognized how controversial the principle itself is. 
For our purposes the primary implication is that all physician activity, in- 
cluding medical experimentation, which is not undertaken for the benefit 
of the patient ought to be forbidden. 

Although the requirement of informed consent is not traditional in 
Hippocratic medicine, it is possible to justify such a requirement on 
patient-benefitting grounds.  Indeed, if we recognize that judgments about 
what is beneficial to a particular patient will vary from patient to patient 
depending upon the particular norms and values of that person, a strong 
case can be made that informing patients of treatment alternatives so that 
they can participate in or even control the decision-making process will in- 
crease the likelihood that patient-benefits will be maximized. Especially 
in cases of what might be called therapeutic research, that is research 
which simultaneously has two objectives, pursuit of knowledge and potential 
benefit to the patient, patients might plausibly maximize benefits by 
choosing between more conservative, standard therapies and experimental 
therapies on the basis of their own inclination to take chances and their 
faith in technological innovation.  Thus even in classical Hippocratic 
ethics informed consent may have an important place. 

The decisive case for testing the relationship between patient-benefit 
and informed consent ought to be the special situation where someone (usually 
the physician) believes that getting patient consent will do harm to the 
patient rather than produce benefit. Testing a psychoactive compound for 
the treatment of schizophrenia is an example. Testing an experimental 
cancer drug on a terminally ill patient who does not know his or her diag- 
nosis or prognosis is another.  If informed consent is a derivative princi- 
ple designed to ensure patient benefit, then whenever getting consent 
would do more harm than good it ought to be waived.  This exemption is 
explicit in the 1971 FDA regulations for consent for use of an investi- 
gational new drug. Consent is to be obtained except where the investi- 
gators "deem it not feasible or, in their professional judgment, contrary 
to the best interest of such human beings" (i.e., the subjects).12  It is 
implied in the December 1, 1971, version of the DHEW Guidelines.  Citing 
the important Halushka vs. University of Saskatchewan case, the guidelines 
specify that: 

Where an activity involves therapy, diagnosis, 
or management, and a professional/patient re- 
lationship exists, it is necessary "to recognize 
that each patient's mental and emotional condi- 
tion is important. . .and that in discussing the 
element of risk, a certain amount of discretion 
must be employed consistent with full disclosure 
of fact necessary to any informed consent."13 

The draft regulations as revised and published in the Federal Register 
October 9, 1973,14 and the final regulations as published May 30, 1974, 
also have no such exclusion. There are two possible explanations. First, 
the drafters of the regulations may have continued in their commitment to 
patient benefit, but held that consent will, on balance, be a practice 
which is patient-benefitting in the sense of protecting them from risk even 
in those cases where researchers believe that the patient would be benefit- 
ted more by not being told.16  If physicians were not capable of perceiving 
what would benefit the patient — in terms of the patient's own values--or 
what the patient's response to the request for consent would be, then the 
consent should be obtained even if, in the physician's judgment, it might 




do harm.  Alternatively they may have held that informed consent is 
so fundamental to the subject's rights in the therapeutic experiment 
that it must be retained even in cases where the physician (rightly) 
perceives that it might do more harm than good.18 

We are left with a confusion in the current guidelines. Consent is 
only required in cases where subjects have been found to be at risk im- 
plying that consent is somehow inherently linked with and subordinated to 
the primary goal of protecting patients from harm. On the other hand the 
researcher and the local committee are not permitted to waive consent on 
grounds of net patient-benefit. 

More doubt is cast on the adequacy of the patient-benefit grounds for 
informed consent when one realizes that the patient-benefit principle tra- 
ditional in medicine would rule out entirely all non-therapeutic experiments, 
that is, experiments designed to gain knowledge useful to society, but with 
risks not justified on patient-benefitting grounds alone.  It is clear that 
any physician who holds to the principles of the Hippocratic Oath cannot 
participate in any non-therapeutic research.  To do so would be to act 
other than strictly for the benefit of his patient. 

The standard of the Hippocratic ethic, however, is the standard of a 
private, professional group. The ethical principles of private groups in- 
cluding medical groups ought to be of minimal importance to the National 
Commission for the Protection of Human Subjects. It is the purpose of the 
Commission to determine an ethically acceptable basis for human experi- 
mentation whether or not that basis is consistent with the ethical view of 




any such private groups. Nevertheless it is of interest that the medical 
profession itself has abandoned its sole commitment to patient-benefit 
when it considers non-therapeutic experimentation.  In 1954, five years 
after its general reaffirmation of the patient-benefitting principle as 
the only grounds under which a physician could do anything to weaken the 
physical or mental resistance of a human being, the World Medical Associ- 
ation adopted its "Principles for Those in Research and Experimentation" 
which clearly approves research on healthy subjects. By 1962 in the 
Declaration of Helsinki the World Medical Association explicitly adopts a 
principle approving of non-therapeutic experiments "because it is essential 
that the results of laboratory experiments be applied to human beings to 
further scientific knowledge and to help suffering humanity." The American 
Medical Association has similarly approved of non-therapeutic research, im- 
plying that it too has abandoned the Hippocratic or patient-benefitting 
ethic as its decisive norm. 

Of primary importance to the National Commission, however, are not the 
norms of private groups, including professional groups, but publicly legiti- 
mated and accepted ethical standards. In one sense the Commission's task 
is the protection of human subjects. Clearly the easiest way to protect 
human subjects would be to ban all non-therapeutic research.  Consent 
might be justified for therapeutic experiments on patient-benefitting 
grounds, but it is not clear that consent should always be required even 
in those experiments. It would be required when, and only when, patients 
would be more likely to benefit by giving consent. 




I take it as accepted by the Commission and by most reasonable people 
in our society that at least some non-therapeutic experimentation is ac- 
ceptable. If that is the case, however, the sole task of the Commission 
cannot be the protection of human subjects. Likewise the sole foundation 
of informed consent cannot be patient-benefit. In fact when patients con- 
sent for non-therapeutic experimentation, and for much therapeutic experi- 
mentation as well, consent seems to function rather to cancel the implicit 
obligation of the physician that he will strive only to benefit the patient 
and protect him from harm. Logically, if consent functions to waive the 
obligations of the norm of patient-benefit, it cannot itself be grounded 
in patient-benefit. 

Since the logical implication of the patient-benefitting principle — 
that research can be done only for the benefit of the individual patient/ 
subject--is strongly counter-intuitive, that awareness may be sufficient 
to reject the patient-benefitting principle as the foundation of informed 
consent.  The principle itself, however, suggests an even better reason.  One 
should ask why it is that physicians or others would feel a duty to act 
only so as to benefit the patient and protect him or her from harm. It 
seems the most plausible answer is that the individual human being (who is 
sometimes in the patient role) is seen as an autonomous entity with special 
claims against the rest of us — claims normally called rights. This aware- 
ness that the individual human is uniquely endowed with rights is variously 
expressed in the Western tradition by saying that humans were created in 
the image of God (Genesis), are to be treated as an end and never only as 
a means (Kant), or simply that they are endowed by their Creator with 




certain inalienable rights. If, however, the individual person is always 
to be treated as an end and never only as a means, it must mean more than 
that simply others must avoid taking risks with that individual. To be 
a person is to be an autonomous individual, the possessor of rights. 

This notion of the human as an autonomous individual who is the pos- 
sessor of rights is not explicit in the patient-benefitting Hippocratic 
tradition. In fact, the explicit notion of individual rights is, like 
the principle of informed consent, uniquely modern. It is understandable 
that modern medical professionals who remain Hippocratic in their ethic 
would tend to link consent to risks and benefits for the patient. For 
those more explicitly committed to individual rights, however, that Hippo- 
cratic view limited to benefits and risks to the patient will be an in- 
adequate foundation for the patient-physician relationship. It will be 
even less adequate for the relationship between researcher and subject. 

B. The Social Benefit Theory of Informed Consent 

If it seems implausible that the primary purpose of informed consent 
is to protect patients against risk—although it may in some instances 
function in this way--some may find its purpose in the more generally ac- 
cepted ethical theory of utilitarianism.  According to this view, as artic- 
ulated by Bentham, Mill, and others,20 that course of action is right which 

produces the greatest good for the greatest number. Experimentation would 
be justified according to this view if, all things considered, more good 
than harm came from the experiment and more net good came from the experi- 
ment than any other plausible course of action. Holders of this view are 




sophisticated in recognizing that good cannot be limited to economic 
considerations. Aesthetic, cultural, religious, and psychological 
goods and harms would have to be taken into account. The deprivation 
of liberty to a small group of subjects would not necessarily be justified 
by great goods to a great number of others provided that one counted the 
deprivation as a very grave harm. 

If non- therapeutic experiments are to be justified at all, there 
almost has to be some element of social benefit included in the justifi- 
cation. As long as experimenting in medicine was in the context of patient 
care, that is when experiments were therapeutic in intent, social benefits 
of the research were ancillary. With the modern period, however, when 
rational design of research in the pursuit of knowledge gave independent 
grounds for experimenting, benefits to others became significant and at 
the same time introduced a potential conflict with the benefit-to-patient 
norm. 

It is often not realized how modern a phenomenon systematically designed 
experimentation is.  Experimental medicine is often dated from William 
Harvey's publication of his studies of animal circulation in 1628.21 

While this work exemplifies research for the pursuit of knowledge, even 
this did not involve systematically controlled research exposing human 
subjects to such risks as double blind placebo administration. Systematic 
investigation of this kind is a nineteenth and even more a twentieth cen- 
tury phenomenon. 




By the beginning of the nineteenth century research for the good of 
society rather than the individual patient began to be defended. Thomas 
Percival was asked by the trustees of the Manchester Infirmary to prepare 
a code of ethical conduct for physicians to help them overcome an internal 
dispute. The Code, which was published in 1803, has become the foundation 
of Anglo-American physician ethics. In the document Percival is explicit 
in justifying medical experimentation on broader public benefit grounds: 



Whenever cases occur, attended with circumstances not 
heretofore observed, or in which the ordinary modes 
of practice have been attempted without success, it is 
for the public good, and in especial degree advantage- 
ous to the poor (who, being the most numerous class of 
society, are the greatest beneficiaries of the healing 
art) that new remedies and new methods of chirurgical 
treatment should be devised. But in the accomplish- 
ment of the salutary purpose, the gentlemen of the 
faculty should be scrupulously and conscientiously gov- 
erned by sound reason, just analogy, or well authenti- 
cated facts. And no such trials should be instituted 
without a previous consultation of the physicians or 
surgeons according to the nature of the case. 22 



There is not any hint of a patient consent requirement, but there must 
be previous consultation with "the gentlemen of the faculty." Given the 
context of the tensions at the Manchester infirmary at the time it is plausi- 
ble to see this consultation as serving more general social purposes includ- 
ing protection of the hospital's image as well as making sure that the ex- 
perimentation is "for the public good, and in especial degree advantageous 
for the poor." 

In Claude Bernard, the father of modern medical experimentation, the 
justification of experimentation in terms of the general good it will do 




goes even further. In his famous Introduction to the Study of Experimental 
Medicine in 1865 he boldly claims that "Christian morals forbid only one 
thing, doing ill to one's neighbor. So, among the experiments that may 
be tried on man, those that can only harm are forbidden, those that are 
innocent are permissible, and those that may do good are obligatory."23 

The question remains, if the underlying justification of medical 
experimentation is that it will produce good social consequences on 
balance, what is the place of informed consent? For the most part it would 
appear that more research could be done more efficiently to produce more 
good consequences if the consent requirement were eliminated. There is one 
justification for the consent requirement, however, even on social benefit 
grounds. It may be that a general policy of research for social good with- 
out patient or subject consent would soon create public suspicion and 
severe handicap for the research enterprise. Subjects would resist situ- 
ations where experiment was likely. A requirement that all subjects must 
give consent would assure lay people that they would not be unknowing 
subjects of medical research. The general consent rule might simply be a 
clever way of promoting long-run social utility.24  To hold that the social 
usefulness of information per se is sufficient to make consent expendable 
requires some commitment to a social utility theory. 

A test case would be an experiment which, by its very nature, could 
not be done with consent, for example psychological studies of perception. 
Under these circumstances no good could come if consent were obtained, 
while some good might come if the research were permitted under controlled 




non-consent circumstances. The social utility theory of consent is sup- 
ported by the fact that current DHEW guidelines permit waiving of the 
consent requirement when "that use of either of the primary procedures 
for obtaining informed consent would surely invalidate objectives of 
considerable immediate importance." 

There is some evidence that the original introduction of consent 
for research in the nineteenth century had as one of its purposes the 
preservation of the research for the social good which could come. In 
1822 William Beaumont began his famous experiments on gastric physiology. 
His work was made possible because one Alexis St. Martin suffered an acci- 
dental shotgun wound leaving a fistula (a direct opening) to the stomach. 
St. Martin signed a written contract with Beaumont agreeing to be his 
"covenant servant" for one year.26  St. Martin was destitute and destined 
to be deported as an alien unless he could find some way to support himself. 
His agreement with Beaumont was the solution. Its quality is primitive by 
late twentieth century standards. Being a binding agreement it seems in 
part designed to guarantee that once Dr. Beaumont had invested in the sub- 
ject, St. Martin would continue with him until the results could be demon- 
strated to his colleagues.27 

While there are instances where the consent seems to function to 
serve the general social welfare rather than protect the patient, for the 
most part that does not seem to be its primary purpose. In fact while, 
in contrast to the patient-benefitting principle, the principle of social 
benefit legitimates non-therapeutic research, it seems to legitimate too 




much. According to the principle, research which will on balance serve 
the general welfare must be done. Only in cases where the consent facil- 
itates the research would it be necessary. 

It is the social benefits principle which, together with some strangely 
ethnocentric values, led to the Nazi experiments and the decisive challenge 
to the bonum commune defense of medical experimentation.28  It became clear 
at Nuremberg as never before that fundamental human rights were at stake 
in non-therapeutic research justified on the grounds of the greater good 
for society. 

C. The Self-determination Theory of Informed Consent 

If maximizing social benefits leads to unacceptable violation of the 
rights of the individual subject, the drafters of the Nuremberg Code had 
two options. They could return to the older Hippocratic formula insisting 
the research be undertaken only when it is justifiable in terms of benefit 
to the patient/subject. Alternatively they could hold to the legitimacy 
of research for the good of the community and control against excesses by 
articulating some limiting principle. The authors chose the latter course. 

The second principle of Nuremberg makes clear that social benefit has not 
been abandoned.29  But informed consent is introduced as the first principle 
clearly not to facilitate social benefits, but as a check against them.30 
We are led to an inescapable conclusion. Anyone who imposes an informed 
consent requirement on medical research for a reason other than the instru- 
mental value that consent might have in furthering research for the common 
good must recognize that individual subjects have claims against the society, 




claims so strong we call them "rights." There must be rights of the in- 
dividual which have standing even against the claim that the greater good 
would be served if those rights were compromised. 

This should not sound strange at least for one steeped in Anglo- 
American political philosophy. Americans have learned that all are en- 
dowed by their Creator with certain inalienable rights including life, 
liberty, and the pursuit of happiness. The Constitutional guarantee to 
due process before deprivation of liberty cannot be sacrificed simply be- 
cause the good of the community would be served. 

Although informed consent may, upon occasion, promote benefits to the 
patient and/or benefits to society, it is clear that its primary purpose 
stands over against these consequentialist objectives. Informed consent 
functions as a waiver of certain individual rights for the good of self 
(patient/subject benefit) or others (social benefit). In particular it is 
the individual's right to self-determination which makes informed consent 
necessary for all invasions of the body or even invasions of one's privacy. 
The principle of autonomy—the right to self-determination—provides an 
independent foundation for the informed consent requirement, a foundation 
much more solid than the justifications of informed consent which occasion- 
ally can be derived from concern over protection of the individual against 
risk or protection of the society by protecting the larger research enter- 
prise. It is because of this self-determination foundation that consent 
giving can be seen as a negotiation of a contract. 




There is strong legal evidence that this self-determination theory 
of informed consent is the philosophical foundation of the consent re- 
quirement. It was not always the case in American jurisprudence, however. 

As late as 1871 32 and again in 1895 33 major court opinions dealing with 
experimentation omitted any requirement for consent.  But in 1914 Justice 
Cardozo articulated forcefully the patient's right to self-determination 
as the basis for surgery: 



. . .Every human being of adult years and sound 
mind has a right to determine what shall be done 
with his own body; and a surgeon who performs an 
operation without his patient's consent commits 
an assault, for which he is liable in damages.... 
This is true except in cases of emergency where 
the patient is unconscious and where it is neces- 
sary to operate before consent can be obtained....34 



The self-determination principle was clearly reaffirmed as the foundation of 
consent in the famous Natanson v. Kline case in 1960, where Justice 
Schroeder argued: 



Anglo-American law starts with the premise of 
thoroughgoing self-determination. It follows 
that each man is considered to be master of his 
own body, and he may, if he be of sound mind, 
expressly prohibit the performance of life- 
saving surgery, or other medical treatment. 



The principle of consent was applied to experimentation as opposed 
to routine treatment in Fortner v. Koch 36 in 1935. 

There is some evidence that the authors of the DHEW guidelines recog- 
nize that informed consent as well as other rights are independent of the 
question of risks and benefits to subject and society. Whenever review is 




mandated, review committees have three substantive tasks: to determine that (a) the risks to the subject are so outweighed by the sum of the benefit to the subject and the importance of the knowledge to be gained as to warrant a decision to allow the subject to accept these risks; (b) the rights and welfare of any such subjects will be adequately protected; and (c) legally effective informed consent will be obtained by adequate and appropriate methods.37 That there are three co-equal requirements of review makes clear that the right to consent as well as the other "rights and welfare" mentioned in clauses (b) and (c) are not derived from the notion of risk to the subject. If they were it would be more appropriate to say that the review committee must see that the risks to the subject, including violations of rights, are so outweighed.... The DHEW guidelines follow traditional theories of rights in American political philosophy by recognizing that rights of individuals, including the right to consent, are independent of consideration of risks.

Yet if that is so, it is paradoxical that (b) and (c), that is, the protection of rights including the right to consent, are to be assured by review only in cases where the subject is at risk. Logically it would make sense to require a determination that risks are sufficiently outweighed by potential benefits only in cases where subjects are at risk, but it is fundamentally illogical to require that the rights of the subject be protected only in cases where the subject is at risk.

If it is correct that the principle of self-determination is the pro- 
per foundation of a theory of informed consent and that rights of subjects 






exist independent of considerations of risk and benefit, then there seems to me to be only one possible explanation of the subordination of the determination of protection of subject rights to the determination that the subject is at risk. To make this clear, let me suggest two forms the notion of self-determination might take as a basis for informed consent.

The first I would call the weak theory of self-determination. Ac- 
cording to this view an individual has the right to self-determination re- 
garding invasion of his body or his privacy only when exercising that self- 
determination will materially affect his welfare. In this case an individual's 
right to self-determination is limited to the area of risk-taking. On the 
other hand we might speak of a "full theory of self-determination." If an 
individual is always to be treated as an end and never only as a means, 
that individual is the possessor of autonomy in all areas of his life, not 
simply in cases where material risks and benefits are at stake. In fact, at least within limits we shall consider below, he possesses the right to self-determination to make choices which are contrary to his own interests.
Put in these terms it seems most implausible that the rights of life, 
liberty, and the pursuit of happiness would carry the proviso "only in 
circumstances when risks and benefits are involved." Many of the cases 
where one exercises the right to self-determination are cases where risks 
and benefits as we normally think of them are not at stake. The consti- 
tutional rights to liberty and privacy cannot be so limited that they only 
apply in cases where a committee has determined that the subject is at risk. 
The right to confidentiality, for instance, which is normally subsumed under (b), cannot be conditional on the subject's being at risk.






If one examines the list of basic elements of informed consent one discovers that some of the items included are not directly linked to the subject's calculation of risks and benefits. For instance, suppose
human blood were needed to develop a test for sickle cell anemia and 
trait in fetuses. Blood samples are to be obtained from adults with 
and without sickle cell disease or trait for purposes of developing the 
test. The eventual objective of the research is to develop the diagnosis 
in time so that all fetuses with disease or trait could be aborted thus 
improving the gene pool. If the blood were obtained as remainder blood 
from routine diagnostic work, it is difficult to conceive of any risk to 
the subject in having it used in the study. He will never be at risk to 
be aborted as a fetus and, if he already has reason to believe he and his 
spouse do not have the disease or carrier status, his offspring could not 
even be affected in any direct manner. Yet it seems that some people might object to the purposes of this research. Still more might object to having their blood used for this study without their consent. That presumably is why the first basic element of informed consent includes a fair explanation of the purposes of the research.39

A second example of a piece of research where no plausible risk to the subject is at stake, and yet subjects might plausibly want the opportunity to consent, involves a study using human placentas for basic physiological study. Placentas normally discarded in the delivery room as a matter of routine would be salvaged for research purposes. It could plausibly be argued that the women from whom the placentas were taken were not at any risk from the study. They were not being asked to modify the






delivery procedure at all. Yet it seems plausible that many women would want to be told that the placenta was to be used in this manner. Some may object; others would gladly consent — if they are given the opportunity.

A third example is the studies of patient compliance with medical instructions mentioned by Levine.40 Even if one had no reason to fear direct risk of ridicule, one might plausibly object to the concept of "compliance" on the grounds that such research is often built on the unstated hypothesis that patients are wrong in their judgment not to follow medical advice (or doctor's orders). If one believed that such patient judgments were often rational given the value system and world view of the patient, but also that those studying "compliance" did not share that belief, then one might want to refuse to participate in such compliance studies on the grounds that they were misguided, had the potential of leading to erroneous conclusions, and, if nothing more, were paternalistic in their conception. Such a patient might reasonably want the opportunity to refuse to participate in such studies because he or she objects to the purpose of the study rather than the risks.

Even the use of autopsy material and severed organs and limbs for 
research raises questions which certain individuals would find potentially 
meaningful or useful. For instance Orthodox Jews might object on theologi- 
cal grounds to autopsy and subsequent research use unless they were directly 
linked to the saving of a particular life. Objections to the purpose 
of the research as well as idiosyncratic objections based on unique systems 
of belief and value can be made independent of risk/benefit considerations. 






The point is a logical one: if the right to self-determination is the 
proper basis of the consent, it is illogical to make the exercise of 
that right dependent upon the subject's being at risk. 

II. THE STANDARD OF REASONABLY INFORMED CONSENT 
If the proper theoretical foundation for informed consent is the 
principle of self-determination or autonomy, this ought to have impli- 
cations for our understanding of informed consent in various research 
settings. Before looking at those implications for specific settings I 
want to connect this self-determination theory to a question which has 
received much attention recently in the legal literature: the question 
of the standard to be used in deciding how much information ought to be 
transmitted for a consent to be informed. 

Before looking at some plausible alternative answers it is necessary 
to put aside one red herring, the standard of "fully informed and free 
consent." Researchers sometimes argue that it is impossible to give the 
subject enough information for consent to be "fully" informed. To do 
so would require an infinite amount of information—or at least a full 
medical education. Since consent cannot be fully informed, they argue, 
the physician should select particularly important items to transmit, but 
not strive for an impossible standard. 

I claim this is a red herring because no one, or at least no one 
who has thought about it, really demands "fully" informed consent. It is 
not only impossible, but would be terribly tedious. It is more plausible 






to require that all potentially useful or meaningful information be trans- 
mitted. I say meaningful as well as useful since, as in the case of the 
placentas, some information might be seen as meaningful even if no con- 
crete use can be made of it. 

We are still left with the question of how much information ought to be transmitted if the standard is that which is potentially useful or meaningful. Most believe that the traditional standard was some variant
on the standard of the profession: what the reasonable physician would 
have disclosed under the circumstances. The court case which is often 
cited is Natanson v. Kline, especially the qualification that: 



The duty of the physician to disclose, however, is limited to those disclosures which a reasonable medical practitioner would make under the same or similar circumstances. How the physician may best discharge his obligation to the patient in this difficult situation involves primarily a question of medical judgment... the physician's choice of plausible courses should not be called into question if it appears, all circumstances considered, that the physician was motivated only by the patient's best therapeutic interests and he proceeded as competent medical men would have done in similar circumstances.43



The standard of the profession has been challenged widely in court 
cases in ten states and in many articles in legal journals. I presume 
that this legal development, which I take to be the most exciting theoretical and conceptual shift in the ethical and legal dimensions of medicine in the
twentieth century, will be thoroughly discussed in the legal documents on 
informed consent being prepared for the Commission. My task is to point 
out the philosophical implications and the connection of this shift to the 






three theories of consent I have developed. 

From Justice Schroeder's opinion in Natanson v. Kline it appears 
that patient-benefit is an underlying concern for setting the standard of 
how much information is to be transmitted. The physician is to be moti- 
vated only by the patient's best therapeutic interests. Even if one as- 
sumes that patient-benefit is the primary foundation of informed consent-- 
an assumption which we have challenged and which would rule out all non- 
therapeutic experiments—it would still be necessary to make further as- 
sumptions in order for the standard of the profession to be used in deter- 
mining how much information must be transmitted. It would be necessary 
to assume that the physician or physician/researcher was the proper person 
to determine what is in the patient's best interest. 

This presumption appears to rest on an old model of medical decision- 
making, one which sees medical choices as essentially technical matters 
based on the scientific skills of the physician. If we can presume that 
the values underlying the decision are agreed upon and the only question 
is which course would promote the desired end, then those with technical 
competency would appropriately be able to decide what would be in the 
patient's interest. 

It seems clear, however, especially in cases where the patient is to 
choose between a conservative approach using an established therapy and a 
more innovative course with an experimental therapy, that we cannot agree 
on the values underlying the decision. 






If consent for experiments were based upon either subject-benefit or broader societal-benefit, and we could assume there is some expert in deciding what is beneficial other than the subject himself, then it would be plausible to limit the information transmitted to those items which the expert considered necessary in deciding what would be beneficial. Thus, apparently beginning from a patient-benefitting motive, Garnham proposes substitution of a physician's informed judgment for that of the patient.

Even if this were the theoretical underpinning of the consent, how- 
ever, it is unlikely that the medical professional would be the appropriate 
expert at least unless he had sufficient psychological skills to decide 
what would benefit and what would harm. In cases of consent for experi- 
mentation the subject-benefitting consideration which might require getting 
consent or place limits on getting that consent is primarily the psychological benefits to the patient/subject. If the patient/subject would be distressed at not knowing what was being done, then consent should be obtained. If the patient/subject would be distressed at hearing the details of the research or its purposes, then it should not be obtained—according to this theory.
But it would normally be psychological experts who could most appropriately 
make that judgment. If, on the other hand, consent is rooted in a bonum 
commune defense, then the appropriate expert would be someone such as a 
sociologist skilled at judging community sentiment about the research 
enterprise. In neither case would the (non-psychiatric) physician have 
the relevant skills. 






If the theory behind informed consent is the individual's right 
to autonomy or self-determination, however, then the appropriate standard 
for how much information should be transmitted should not be related to 
any of these professional skills. The standard ought to be the amount 
of information necessary for the subject to exercise self-determination, 
that is, the amount of information the subject would find useful or meaningful, independent of whether the researcher or the research community would find that information useful or meaningful.46a If the objective of
the consent is to promote self-determination, then it is the subject pop- 
ulation itself which must provide the standard for determining how much 
information is to be transmitted in order to exercise self-determination. 

Earlier I said, with regard to the Natanson v. Kline case, that most believe that this case puts forward the traditional standard of the profession. In fact a close reading of it reveals it is much closer to the reasonable person standard than most realize. In the earlier quotation a key phrase was omitted, one which is often overlooked. It says the standard of the profession is to be used in judging the adequacy of information "So long as the disclosure is sufficient to assure an informed consent." The fact that the patient-benefitting criterion and the standard of the profession are specifically qualified in this way suggests that Justice Schroeder must have had something more in mind.

This shift to lay standards--determining what the reasonable person would want to know before consenting to research or therapy--is now becoming the basis for judging whether a consent is informed. The "reasonable man" (or "reasonable person") standard is now explicit in court cases in many jurisdictions, beginning with Berkey v. Anderson in California in 1969, in which it was argued that:

We cannot agree that the matter of informed consent must be determined on the basis of medical testimony any more than that expert testimony of the standard practice is determinative in any other case involving a fiduciary relationship. We agree with appellant that a physician's duty to disclose is not governed by the standard practice of the physicians' community, but is a duty imposed by law which governs his conduct in the same manner as others in a similar fiduciary relationship. To hold otherwise would permit the medical profession to determine its own responsibilities...47

The reasonable person standard for determining how much information is necessary for consent to be informed has radical implications for local experimentation committees. I have recently completed a study of those implications, the full text of which is available for the Commission's use.48 Here I shall summarize the conclusions. One task of such committees is to determine if legally effective informed consent will be obtained. If, however, self-determination is the foundation for making that decision, and therefore the reasonable lay person's judgment is necessary for deciding how much information that is, then a committee which is skewed in its composition away from that representative reasonable lay person will not be capable of deciding whether the consent proposed is adequate. If committees include research scientists in greater proportion than in the general public, and those research scientists predictably give atypical answers to such questions as whether they would want to know certain information and whether






the risk is "worth it" given the potential benefits of the knowledge, then such committees will give predictably unreliable answers to such questions. It is not just that the committee must include some lay representation. Rather, in order to carry out adequately this one particular function of deciding what the reasonable lay person would want to know, the committee must be made up entirely of lay people (or alternatively composed in such a way that special professional biases are neutralized). Of course, for other functions, such as establishing the risks, professional skills are needed. This remains a fundamental dilemma which, as I see it, can only be resolved by having two committees (one lay, the other professional) or by reducing professionals to a strictly technical advisory capacity. The capacity of lay people to make such judgments, and the fact that those judgments differ from those of professionally staffed IRBs, is documented by Norman Fost's study of a "surrogate system" for informed consent.49 His proposal differs from mine in that the lay people would not actually function as a committee.

Even if those with special medical skills and the unique value commitments which accompany those skills are limited to the role of technical advisors to an all-lay committee, there is reason to doubt that it is even theoretically possible, much less practical, to transmit information to the committee in a "neutral" manner. Perhaps we should consider shifting to the advocacy system for such review of protocols. Under such a system technical staff selected purposefully because of their inclinations for and against the research enterprise would be charged with the tasks of presenting the best technical cases for and against the protocol under consideration. The lay committee, having heard the cases, would, after an opportunity to request further information and explanation, exercise their judgments as reasonable people about the adequacy of the consent (and presumably also whether the risk to the subject was justified by the potential benefit to subject and/or others).

There is one additional problem with the use of the reasonable person standard for assuring that subjects will receive the information they consider useful or meaningful. What of the subject who is "unreasonable" in the technical sense in which the term "reasonable" is used in the law? What of the subject who desires more or less information than the reasonable person? If the goal is providing enough information for adequate self-determination, surely the reasonable person standard is not adequate for such subjects. If there is any reason to believe that the particular patient or subject wants more information than the reasonable citizen, then the patient's or subject's own standard of certainty must apply. If a subject communicates to researchers that he wants more information of a particular sort than the reasonable person would, there is an obligation of the researcher to give that additional information if the subject is to continue to be part of the experiment.50 At least for non-therapeutic experiments, it ought to be sufficient for the researcher to drop such a subject from the research. For potentially therapeutic experiments, the abandonment of the patient/subject by the physician/researcher when he or she has a treatment potentially beneficial to the patient/subject would raise the same problems as any physician abandonment. The obligation to give ample notice and a referral to another physician willing to provide the treatment might be






required—at least within the limits of reasonableness. That few other 
physicians may be capable of giving the experimental treatment makes the 
case even more difficult than the normal therapeutic situation. 

The case of the patient/subject who communicates that he or she wants 
less information than the reasonable person would be a more difficult prob- 
lem. Since I am contending that the principle of self-determination is 
the one which ought to be used in requiring informed consent, it might be 
possible to argue that the patient/subject should have the right to deter- 
mine that there is some information he or she would rather not have. That, 
of course, does not make the patient/subject's request for less information 
an ethical request. If the human is ethically responsible for decisions 
about his or her own medical future, it can be seriously questioned at the 
ethical level whether one is justified in waiving information necessary to 
make a consent informed. Nevertheless in cases of routine patient care 
such a waiver might be taken as sufficient to relieve the physician of an 
obligation to disclose. 

In the case of experimentation, however, I am not convinced that conclusion can be reached. I would still oppose imposition of information on the subject against explicit instructions from the subject. The researcher has another option, however: the investigator can turn to other subjects. That would seem to me to be the preferable course.

If the standard for an adequately informed consent is the standard of 
the reasonable person (modified in cases when there is evidence the subject 
differs from that standard), we are still left with the question of substance: 






what information must be transmitted? The exact content must be determined by reasonable representatives of the public on a case-by-case basis. Some basic elements of informed consent, however, spell out the kinds of information necessary. In addition to the six elements currently included in the DHEW guidelines,51 there are some additional elements I believe a reasonable person would want to know before giving an adequately informed consent.52 These include:



1. A specific disclosure of the presence of a control group within the research design.53

2. A statement of the "inconveniences" as well as the risks and discomforts.54

3. Names of review and patient protection agents 
including the person in the institution and 
the person at the federal level who should be 
contacted if the subject has further questions 
about the experiment. 

4. A statement of the basic rights of the subject. This should include not only the presently required statement of the right to withdraw without prejudice, but also the right of access to the alternative treatments, mention of which is presently required.

5. Explanation of who, if anyone, will be responsible for harms done to the subject. This should include an explanation of who, if anyone, will be responsible both for anticipated harms, the risk of which was included in the consent, and for negligent and non-negligent but unanticipated harms.

6. An explanation of the right, if any, to continue 
receiving treatment found helpful to patient/ 
subject. 



In addition, the current DHEW regulations prohibit "exculpatory language through which the subject is made to waive, or appear to waive, any of his legal rights, including any release of the organization or its agents from liability for negligence."55 I see no reason why the prohibition should be limited to liability for negligence. I would propose dropping the "for negligence" so that exculpatory language waiving or appearing to waive liability is prohibited, whether it is liability for negligence or some other liability.

These new elements which I believe are necessary for a consent to be adequately informed should be added to those currently in the list of six elements in the DHEW guidelines. I also endorse many of the elements proposed by Robert J. Levine,56 including especially the requirement that there should be a clear invitation rather than a request or demand, that the subject be informed why he has been asked to participate in the study, and that there be a suggestion to the prospective subject that he or she might wish to discuss the proposed research with another before consenting. I believe I disagree with Levine's final element--consent to non-disclosure--but only in that he does not specify the limits of the non-disclosure. I shall discuss such limits below when considering research which could be destroyed if informed consent were obtained.

I also share Levine's skepticism about the "short form" of the written consent document. The use of a short written form which has the subject affirm that items have been explained orally serves no useful purpose, especially since a written version must be on file with the IRB. In some cases it leads to suspicion about what is actually communicated, not necessarily because the researcher is not trusted, but because staff actually obtaining the consent may accidentally omit certain items. I also share






Levine's doubts about general consent forms for categorically related research. Such a form also fails to provide evidence of the actual consent should litigation arise. I believe both short forms and general consent forms should be excluded as not assuring legally effective consent. I would thus favor deletion of paragraph 46.10(b) from the May 30, 1974, version of the DHEW policy.

Finally, there is one procedural problem in the mechanism of getting consent which I think needs correction. Whether a regular or a short written form is used, it seems to me too much to ask of a researcher that he negotiate the consent with the subject himself. The commitment of the researcher to the worthiness of the project and to the justification of the risk on grounds of benefit to the subject and/or others is, or ought to be, very high--or he ought not to undertake the project in the first place. The conflict of interest is too great for a normal person to bear.58 I would favor the use of those with no direct involvement in the protocol to negotiate the consent with the potential subject. (An alternative might be negotiation first with the researcher and then with someone hired as an advocate against the subject's participation.)

III. THE IMPLICATIONS OF THE SELF-DETERMINATION THEORY OF CONSENT

What then are the implications of the self-determination theory of informed consent for subjects in different research settings? The implications will depend upon the setting. In this final section I shall take up, first, subjects whom I would call Group I subjects: competent, non-institutionalized adult subjects receiving private medical care. Then I will turn to the implications of the self-determination principle for Group II subjects,






subjects whose capacity to consent is compromised in some way. 

A. Group I Subjects 

The theory that consent for participation in research is rooted in the principle of self-determination has implications first for those subjects ideally placed to give consent which is relatively free and informed. If we limit ourselves to Group I subjects for non-therapeutic research, i.e., subjects who are mentally competent, non-institutionalized adults who receive health care through private channels, we have probably limited ourselves to the group most capable of exercising self-determination. Some implications are apparent even for this group.

First, if self-determination is the objective, then consent is neces- 
sary for research independent of the risk involved. Second, recognizing 
that self-determination is always a relative phenomenon, determining how 
much information will be necessary for autonomous decision-making insofar 
as the goal is promoting self-determination will have to be based on standards 
as close as possible to the subject's own. Normally this will mean the con- 
sensus of reasonable lay persons, but modified as necessary to bring the 
standard in line with ways in which the subject may be known to differ from 
the reasonable lay person. 

Third, there may be limits to that to which the lay person may accep- 
tably consent. This is a problem which I have not taken up because it 
takes us beyond the nature and definition of informed consent. Even though 
informed consent may be rooted in a theory of self-determination, there may 
be other constraints on participation in research beyond the right to self- 






determination. In a society as thoroughly committed to individual liberty as ours is, those limits may be very broad, but there may nevertheless be limits.

Even John Stuart Mill in On Liberty recognized at least two limits to liberty. The first is harm to others.59 It is unlikely that experiments could be banned, when subjects give free and informed consent, on the grounds that they would do harm to others, but such objections are conceivable, as for instance a viral transduction experiment attempting to manipulate the human genetic code where both researcher and subject are adequately informed and willingly agree to participate in the study.

Although it is not generally recognized, Mill also places a second limit on liberty: the prohibition of surrendering one's own liberty.60 It is possible that some free and informed consents by subjects of Group I would be seen as surrendering too much liberty, in volunteering to take a great risk of death for marginally valuable results or volunteering for experimental brain manipulation, for instance. Such consents could be attacked as not truly free or not adequately informed, but the mandate of
local review committees would permit such prohibitions even if the consent 
were considered free and informed. The committee must determine not only 
if there is informed consent, but independently, whether the risks to the 
subjects are adequately outweighed by the potential benefits to subject and/ 
or others. If the right of self-determination for the competent, non- 
institutionalized adult is taken seriously, the instances where that right 
should be compromised on paternalistic grounds will be extremely limited 






if not non-existent. Occasionally experimentation with free and informed consent might be rejected on the grounds that the subject's liberty cannot voluntarily be surrendered. For the most part, however, such rejection would have to be based on the state's role as protector of the welfare of its citizens. There are limits to liberty in our society--until recently men could be drafted to risk life and limb--but even in those cases the conscription was done in the name of protecting liberty itself. Blocking of experiments in which there is free and informed consent solely on the independent grounds of paternalism seems rarely, if ever, justified.

B. Group II Subjects 

Although I recognize the dangers of overgeneralization, I would like 
to call all groups of subjects where the capacity to consent is problematic 
Group II subjects. I call them Group II because I believe they should be 
considered for human experimentation only in cases where research on the 
first group is impossible. For the most part I mean impossible, not merely
inconvenient. If the foundation of informed consent is self-determination, 
then consent is impossible in cases where self-determination is impossible. 
In all cases of Group II subjects self-determination is either impossible 
or constrained. 

1. Children 

The clearest example of the impossibility of exercising self-determin-
ation is the very young child. In the small child consent has very limited
applicability because self-determination is very limited. I believe, for
the most part, it is a mistake to speak of "proxy consent" for experiments in 
children. Rather we should make clear precisely what is at stake:
research without subject consent justified, if at all, on some other grounds.
For therapeutic research on young children we must fall back on a principle 
which we have seen is highly suspect: the principle of patient-benefit. 
Because, by definition, therapeutic research proposes experimental treatments 
about which there is no consensus as to the benefits, it is never possible 
to justify such experiments on general patient-benefit grounds. 

Parental "approval" or "selection" of subjects for such therapeutic 
research is essential for two reasons: first, under the norm of patient- 
benefit parents in their guardian role are obligated to serve the best 
interests of their children. They are in the best position to protect 
their interests. 

Second, since in cases of therapeutic experiment there is no consensus 
about what would be in the child's best interest, parents are given very 
limited discretion to choose values upon which decisions may be made for 
their children. It is in this second role that parental approval takes 
on the aura of a consent. Parents in our society are given limited authority 
to exercise their own self-determination about the values of their offspring. 
They are permitted to select religious training, parochial education not 
valued by the majority, vegetarian or "organic" diet, and other values not 
generally shared by the ordinary person. In this one sense parental "con- 
sent" is the appropriate term. That parental consent is very limited is 
apparent from the willingness of courts to intervene if parental determination 
of values deviates very far from the social consensus, if, for instance, 
Amish parents were to choose no school rather than, as in the case of parents
choosing parochial education, a minority school. 

One of the areas in which parents are permitted to exercise some 
discretion is in encouraging the child to make minor contributions to the 
general welfare or the welfare of specific others. Parents may encourage 
the child to contribute a small portion of his allowance to the Red Cross, 
for instance. The limits of parental discretion are quite narrow, however. 
The child is not the property of the parent. Non-therapeutic research may
be one area where parental self-determination is to be tolerated within 
these narrow limits. 

One main line of opinion holds that no child or other non-consentable
person can ever be the subject of non-therapeutic research because he cannot
consent, and a human should never be treated as a means rather than an end
unless consent is obtained. This, however, is a highly individualistic under-
standing of individual responsibility. If in addition to being an end in 
himself with inalienable rights, the individual is seen as a member of a 
social community, then certain obligations to the common welfare may be 
presupposed even in cases where consent is not obtained. The dangers of 
balancing individual rights with obligations to serve the common welfare 
are great especially in cases where consent cannot be used as a mechanism 
to judiciously waive those rights. In very special cases, however, where 
truly no risk or minimal risk to the subject is envisioned and when infor- 
mation to be obtained from non-therapeutic experiments on children would be
of great value which can be obtained in no other way, there must be some 
contribution to the general welfare which can be expected without consent
which the reasonable person would find required. This is not to say that 
social benefits can cancel individual rights, that patient benefit can be 
traded interchangeably for social benefit. It is rather to say that it is 
reasonable to treat the individual, nonconsenting subject as a means to an 
end under very limited and circumscribed conditions. 

Even if it is emphasized that this is not the same as making the utili- 
tarian trade off, there are great dangers in such a proposal. For this 
reason, parental approval of non-therapeutic research in such special cases
should be required first, as the best check to make sure that individual 
rights are not unduly compromised and, second, to permit parental self- 
determination to be decisive in deciding whether their offspring will make
a justifiable, but nonconsenting contribution to the general welfare.

All of this is said with regard to consent and parental approval for 
very young children where no self-determination is possible. It seems to 
me to be valid also for older children when potentially therapeutic experi-
menting is contemplated. There are two special problems, however. For
children old enough to communicate, when non-therapeutic experimenting is
contemplated, consent is possible, although a consent which may be neither free
nor informed. Since the child has nothing to gain, it seems reasonable that
his uninformed refusal should nevertheless be determinative. In addition 
to free and informed parental approval and the constraints on that approval 
(for the reasons given above) uninformed consent of the child should also 
be required in non-therapeutic experiments.

Finally, for therapeutic experiments for older youth, some real self-
determination may be possible. If a youth could exercise self-determin-
ation, I see no reason why that should not take precedence over parental 
judgment. The problem, of course, is determining that the youth's judg- 
ment is free and informed. Two solutions seem possible: generally lowering 
the age of majority so that youth can consent on their own or making such 
judgments on a case by case basis. For some medical treatments lowering 
the age of consent for the particular treatment (such as venereal disease 
and birth control services) may be justified. In general, however, I 
think it is wiser to keep the age of consent for medical treatment high — 
at least 18. To adopt a general lower age for consent for medical treatment 
might mean substituting the persuasion of the medical profession, or others
with influence, for the authority of the parent. For therapeutic experi-
menting and for treatments not covered by a specific statute lowering the 
age for consent, case by case adjudication of the judgment of the youth 
disagreeing with parental judgment seems appropriate. 

2. Formerly Competent Adults 

Formerly competent adults—mental patients, the comatose, and the
senile—are, for purposes of consent, very similar to children in that they
lack the capacity to exercise self-determination. They differ, however, 
in several important regards. First, since they are formerly competent, 
at one time in the past they could exercise self-determination. In some 
instances formerly competent individuals may have expressed disapproval 
of experimental cancer treatments or expressed a desire to contribute to 
scientific knowledge of their particular disease. While in children the
parental judgment about what is in the child's interest would be taken 
as decisive within limits, in the case of the formerly competent adult 
the situation is more complex. There is currently great debate about 
whether statements about medical treatments written while competent 
ought to remain valid when one is no longer competent. Some argue
that if the incompetent patient were able to have an opinion now when 
he is incompetent, his opinion would have changed; that it is impossible 
for the healthy individual to anticipate the experience of terminal illness 
or chronic mental incapacity. On the other hand, what judgment could be 
more reliable about the wishes of the now incompetent one? I take it to 
be an assault on the right to self-determination of the competent one to 
hold statements made while competent as unacceptable expressions of the 
best estimate of what one would want when and if incompetent. 

There is a second problem with incompetents that is lacking in the case of
children. While statute normally specifies when a child is a minor in-
capable of giving consent for medical treatment and research, the defin- 
ition of incompetency in adults is much more tenuous. The circle defining 
those who are incompetent is shrinking rapidly. Many patients including 
some committed to mental institutions formerly considered incompetent to 
accept or refuse medical treatments are now being permitted to do so. 

An institutionalized woman suffering from depression was permitted to refuse the
continuation of electroshock therapy. A 60-year-old committed schizo- 
phrenic was permitted to refuse a breast biopsy for diagnosis of a possible 
malignancy on the grounds that she might die, that it would interrupt her
movie career, and prohibit her from having further children. In New
York the state Health Code explicitly specifies that mental patients 
are permitted to refuse experimental treatments. Non-therapeutic
experiments on mental patients should be under the same restrictions
as for youth: in rare cases they might be justified when the in-
formation cannot be gained in any other manner, when there is no risk or
minimal risk, when there has been informed approval by a guardian, and when
there is uninformed pro forma consent by the subject. Therapeutic experiments ought
to be conducted under consent conditions similar to those on a youth. 
Guardian approval or judicial determination that the patient is exercising 
adequate self-determination ought to be required. Expressions made while 
competent, however, should be taken as evidence of the patient/subject's 
wishes. Whether to permit pro forma refusal by the patient to be decisive 
over against guardian approval, as is required in New York, I find a difficult
question. In general, though, the New York policy seems acceptable since 
by definition the benefits are problematic. 

3. Prisoners

Although prisoners are frequently grouped with children and mental 
patients as difficult cases when discussing consent, the problems created 
in the case of prisoners are radically different. It is frequently noted 
that prisoners may not be free psychologically because of the coercive nature 
of the choices offered in the prison setting. It seems to me the only solu- 
tion to that constraint on the prisoner exercising self-determination in 
consenting to participate in experiments is the restructuring of the insti- 
tution so that the choice to participate in experiments is more on a par 
with other options. This might require increasing income opportunities from 



26-45 



other forms of prison employment. The only proposal which I find plausible 
within the present prison structure is that those wanting to do prison re- 
search pay to the prisoners—as a group—fees comparable to what it would
cost to obtain subjects outside of prison while the individual subject 
would receive an amount determined to be proportionate to other income 
producing opportunities considering risk and time involved. The difference 
could then be used by the prison population for educational or recreational 
purposes of their own choosing.

The larger problem for consent in prison research from the perspective
of a self-determination theory of consent seems to me to be in a different 
area. In contrast with children, the senile, and the mentally incompetent, 
there is no reason to presume that prisoners lack the capacity for self- 
determination. If self-determination is a fundamental right in our society, 
then we should be very cautious in infringing upon that right even in the 
name of protecting the individual's welfare. While prisoners do not lack 
the capacity to consent, however, a social judgment has been made that 
their right to self-determination should be greatly constrained. Depending
upon one's theory of imprisonment, infringing upon self-determination is
thought justified either for protection of the public interest, for rehabil-
itation, or for punishment for previous wrongs done. Thus the prisoner's
general presumptive right to self-determination has been compromised.

The implication for prisoner consent depends upon the theory of im- 
prisonment. If the sole purpose of imprisonment is to protect the public— 
to get the criminal off the streets — then it is hard to see why the prisoner's
right to consent to research should in any way be compromised in principle.
For rehabilitation, exercise of the right ought to be encouraged. If,
however, retribution is the basis of the imprisonment, conceivably that
right could be limited. If one of the functions of prison research is to
give the prisoner an opportunity to make amends for previous wrong to
society and to regain his sense of personal worth, then some might argue
that such a "privilege" should not be given. That may be the view of the
American Medical Association in their statement in 1952 in which they
state:

...Whereas, some of the inmates who have participated 
have not only received citations, but have in some in- 
stances been granted parole much sooner than would 
otherwise have occurred, including several individuals 
convicted of murder and sentenced to life imprisonment... 
Resolved, that the House of Delegates of the American 
Medical Association express its disapproval of the 
participation in scientific experiments of persons 
convicted of murder, rape, arson, kidnapping, treason, 
or other heinous crimes, and also urges that individuals 
who have lost their citizenship by due process of law 
be considered ineligible for meritorious or commendatory 
citation. . . .70 

Regardless of whether human beings are imprisoned for purposes of 
protection or retribution, I cannot accept this argument for depriving 
them of self-determination in consenting to experimentation. While some 
constraints on self-determination may be necessary, those constraints must 
be carefully circumscribed. There can be no general loss of basic human 
rights. Until recently being a prisoner brought what was called "civil 
death," the loss of all rights. That radical infringement of rights has 
been abandoned, however, in favor of a much more limited deprivation of 
rights. Self-determination in choices about medical treatment—including
experimental treatment—and about making humanitarian acts—ought not
to be limited any more than it would be for other competent adults. If
constraints are necessary because prisoner consent is feared to be de
facto coerced—even when the economic incentive is removed—that is a
failure of the system which ought not to be attributed to any deprivation 
of the prisoner's right to self-determination in this area. It is particu- 
larly serious if prisoners are deprived of their right to potentially 
therapeutic experimental treatments for this reason. 

4. Clinic Patients 

Like prisoners, clinic patients do not in principle lack the capacity 
to consent, but may be coerced into consenting because of serious con- 
straints on their options for receiving health care. I believe that clinic 
patients—patients whose opportunities for self-determination may be limited
although their capacity should not be—should be treated as Group II sub-
jects just as children, the mentally incompetent, and prisoners are. How- 
ever, since they do not lack the capacity to consent and their rights would 
especially be jeopardized if they are deprived of any opportunity to parti- 
cipate in therapeutic research, I reject what at first seems plausible: 
banning of all research on clinic patients. Rather I would favor as a 
check on de facto coercion a general requirement that at least half of 
all subjects be drawn from sources other than clinic patients. 

5. Subjects in Experiments Where Consent Would Destroy the Research 

There is a final group of subjects whose right to self-determination
is potentially compromised: subjects in experiments where getting informed 
consent would necessarily destroy the experiment. Research in psychology 



26-48 



of perception where the design requires deceiving the subject as to 
the purpose or procedures would be an example. An experiment to test 
the difference in response between subjects receiving a placebo in a 
drug study who are told there is a placebo in the design and those who 
are not told would be another. 

First, it is important to distinguish between cases where consent 
would necessarily destroy the experiment and cases where it would simply 
make the experiment more difficult. Omitting consent for the convenience
of the researcher seems to me to be never tolerable. Further, in some 
cases it may be possible to be clever in designing protocols so that de- 
ception or other lack of informed consent would not be necessary. In some 
cases it is believed that consent would destroy the experiment without any 
adequate grounds for that belief. For instance, I know of no evidence 
that telling subjects there is a placebo in the design of a drug study 
(never, of course, telling them whether they are receiving the placebo) 
would harm the experiment. It is possible that the reports from the sub- 
jects would be different--they may be more cautious in their reporting; 
but I know of no convincing argument that the results obtained would be 
any less valuable. In fact it could be argued that they would be more 
valuable, because the subjects would generally be on guard to make accurate 
reports. I believe that in all designs where a placebo is used, it should 
be a requirement of informed consent to state that there is a placebo in 
the design. 

There will still, however, be research which cannot be done without 
deception of the subject. We have seen that current DHEW requirements
justify such omissions of informed consent. We deduced that a principle
of social-benefits was necessary to omit consent in such cases. But not 
just any social benefit would justify the consent omission. That is clear 
in the DHEW guidelines. First, omissions are justified only when the con- 
sent would "surely invalidate objectives of considerable immediate impor- 
tance," when "reasonable alternative means for attaining these objectives 
would be less advantageous for the subjects, and when the risk to any 
subject is minimal." Thus there is already a clear recognition that not 
just any social benefit is sufficient to waive the consent. In fact the 
requirement that reasonable alternative means for attaining these objectives 
would be less advantageous for the subject is a requirement which would 
possibly permit some therapeutic research deception, but would apparently 
prohibit all psychological studies using deception in normal subjects 
since the deception study is of no advantage to the subject whatsoever. 

I think we simultaneously need to go further and have gone too far.
I believe we may have gone too far if we rule out all deceptive experiments 
where only the good of society is at stake. At the same time we have not 
gone far enough in specifying what principles and what tests would justify 
waiving of the consent. Hans Jonas, in discussing non-disclosure in cases 
where disclosure would destroy the research, also takes a position that the 
subject's rights may be violated even though no harm is done: "Only supreme 
importance of the objective can exonerate it, without making it less of a 
transgression. The patient is definitely wronged even when not harmed."
Jonas seems to limit his argument to non-disclosure in cases of research 
on patients (which he calls "an outright betrayal of trust"). It seems,
however, that the argument works equally for the non-patient subject. 

Jonas also does not develop the argument about what would be sufficient 
to justify such a non-disclosure. It seems to me that the one instance 
where such non-disclosure would be justified, the one principle which would 
justify violating the right to self-determination, must be rooted in the 
concepts of self-determination and trust themselves. If, and only if, 
there is good empirical evidence that the subject would not consider the 
deceptive withholding of information a violation of that trust, would I 
find the non-disclosure acceptable. If, and only if, we can reasonably
presume on the basis of specific empirical evidence that reasonable sub-
jects would not have objected to participating in the experiment without
their consent, would such omission be justified. I believe that is an
empirically testable proposition. I would suggest that for any experiment 
which would be destroyed if informed consent were obtained, researchers 
should be required to draw a sample from the subject population proposed 
in the protocol, explain to these mock-subjects the research in mind, in-
cluding the benefits as well as the deception involved. Subjects should 
then be asked whether they would have considered their right to self- 
determination violated—whether they would have objected to being an un-
informed participant, had the research actually been done on them without 
their informed consent. If we can predict, based on that sample, using a 
reasonable confidence limit such as 95 percent that other subjects drawn 
from the same population would not object, then it seems to be a justifiable 
compromise of the real subjects' right to self-determination. It is indeed 
a compromise because even at that level of certainty one subject in twenty
predictably would object to being made part of the experiment. Nevertheless 
this seems to me to be a reasonable compromise. If a lesser number of 
mock-subjects, say only a majority, approved, that could hardly justify a 
presumption that all or virtually all of the uninformed subjects would have 
approved of the deception. 
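The statistical test proposed here can be sketched concretely. What follows is a minimal sketch, not part of the original paper, assuming a simple binomial model of objection and an exact (Clopper-Pearson) one-sided upper confidence bound; the function names and the 5 percent objection threshold are illustrative assumptions.

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def objection_rate_upper_bound(objectors, n, alpha=0.05):
    """Exact (Clopper-Pearson) one-sided upper confidence bound on the
    population objection rate, given `objectors` objections among n
    surveyed mock-subjects, located by bisection."""
    if objectors >= n:
        return 1.0
    lo, hi = objectors / n, 1.0
    for _ in range(60):  # bisect to ample precision
        mid = (lo + hi) / 2
        if binom_cdf(objectors, n, mid) > alpha:
            lo = mid  # p still consistent with the data; push higher
        else:
            hi = mid
    return hi

def deception_presumptively_justified(objectors, n, threshold=0.05, alpha=0.05):
    """True only if we can be (1 - alpha)-confident that fewer than
    `threshold` of the subject population would object."""
    return objection_rate_upper_bound(objectors, n, alpha) < threshold

# Example: 0 objections among 100 mock-subjects gives an upper bound of
# about 3 percent, so the presumption would hold; 3 objections would not.
```

Note that on this reading "95 percent confidence" bounds the population objection rate rather than guaranteeing that exactly one subject in twenty objects; the compromise the text describes remains, since even a bound below 5 percent leaves room for some real subjects to object.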

The Need for Special Review of Consent in Group II Subjects

Because consent with all Group II subjects is problematic, procedures
are needed to assure that these subjects' right to self-determination is 
not violated--insofar as such capacity exists. I would favor a special 
second level review of all research involving Group II subjects, a national 
board charged with reviewing all protocols using the same criteria as local 
boards. There is good reason to suppose that Group II subjects, especially 
clinic patients, are now used as subjects because their use is the path of 
least resistance. Establishment of an additional level of review would
provide additional incentive to use subjects whose capacity and/or opportun- 
ities to consent are not as problematic. There is sufficient evidence that
local committees vary tremendously in their standards for approving consents
that such a precaution seems necessary to protect the rights and welfare of
these special groups of subjects whose ability to give effective informed 
consent is so problematic. 

The August 23, 1974, draft of proposed regulations for protection of
human subjects includes an alternative to a national level review of con-
sent.75 That draft proposed additional protection for research involving
fetuses, abortuses, pregnant women, and in vitro fertilization. It was
proposed that a local "consent committee" be established to monitor consents.



The proposal could be expanded to cover all of what I have called Group
II subjects. I favor such a committee and am disappointed that it was
dropped from the policy adopted August 8, 1975.76 The argument given
against such committees—that it would cost too much in time, money,
and social benefits—cannot be a definitive argument for jeopardizing the
rights of subjects unless one is committed to a utilitarian calculus of
social costs and benefits. This argument, put forward by a distinguished
group of researchers, should at least not be taken as definitive by the
representatives of the public, since researchers are legitimately expected
by society to have a unique value commitment to the social benefits of the
research enterprise.

My own position, however, is not that the consent committees would 
impede social progress; I am not convinced that they would stand in the 
way of well designed and executed research. Rather I am concerned that 
a second group completely independent of the special characteristics 
institutionalized into the local IRB and exposed sufficiently to the 
special problems of consent in problematic cases be given an opportunity 
to review the quality of the consent as well as the judgment that the jeo- 
pardy to the subject's interests, rights and welfare is justified by the 
potential benefit to the subject and/or others. I would see this most 
effectively done by a national committee. This seems to me to be a com- 
promise preferable to the well articulated and often reasonable demands 
that research be banned entirely on children, prisoners and other Group II
subjects.



In principle I see one ground other than the principle of self-
determination which would justify experiments on human subjects. This
would apply to all experiments including experiments requiring non- 
disclosure. It is often held, I believe correctly, that humans have a 
prima facie obligation to promote justice independent of the consequences. 

This has led to an exciting contemporary debate about the meaning of
justice.78 The theory developed by John Rawls and the more egalitarian
variants of that theory79 would consider some practices fair and even
right which might deprive individuals of their right to self-determination.
The justification, however, is not in the production of good social con-
sequences on balance, but in promoting justice. I believe a theory of
informed consent could be derived from this theoretical work which would
provide a very limited basis for sacrificing the rights and interests of
the individual for the benefit of certain others who are less well off
(but not society in general).80 I have purposely not developed such a
formulation for this paper, relying instead on a theory of self-determination,
because I am not convinced that the theoretical work on the theory of jus-
tice is sufficiently advanced that it could be incorporated into practical
public policy making by the National Commission without the risk of errors
which would jeopardize the rights of individual subjects. I see the develop-
ment of the implications of this theory for informed consent as an important
research problem for the next few years.

I am convinced that biomedical and behavioral research, both thera-
peutic and non-therapeutic, is of tremendous importance to the individual
and to society. In fact, we might reasonably speak of the individual's
right to such research. To do so, however, involves a recognition of
fundamental rights, especially the individual's right to autonomy or self-
determination, which must provide the basis for free and informed consent.
To fail to get such consent will do far more than jeopardize important
benefits to the individual and to society; it will jeopardize those funda-
mental rights themselves.






1. Department of Health, Education and Welfare, Office of the
Secretary, "Protection of Human Subjects," Federal Register 39 (number 
105) Part II, May 30, 1974, pp. 18914-18920. See especially paragraph 
46.2, p. 18917. 

2. In fact I would stand with those who favor even more caution
in getting consent for clinical care and so-called therapeutic experi-
ments than for non-therapeutic research because of the strong, sometimes
coercive, interest a sick person has in maintaining the approval of 
medical professionals. See Robert J. Levine, "The Nature and Definition 
of Informed Consent in Various Research Settings," December 1, 1975, 
paper prepared for the National Commission for the Protection of Human 
Subjects of Biomedical and Behavioral Research (hereafter cited as "Na- 
ture and Definition"). 

3. The alternative is to pack the requirements that consent be
free and informed into the definition of consent. The Oxford English
Dictionary has as its first definition "voluntary agreement to or
acquiescence in what another proposes or desires; compliance, con- 
currence, permission." The ambiguity is apparently within the word 
itself. The first part of the definition includes the requirement of 
voluntariness while the latter synonyms do not. I prefer defining 
consent as the naked permission leaving to the adjectives to specify 
that adequate consent must be free and informed. I believe that adds 
clarity and functionally leads reviewers to the proper questions to ask 
about a particular consent. 

4. Hippocrates, The Sacred Disease, in W.H.S. Jones, ed., English
edition, Hippocrates II, p. 134.

5. Ibid.

6. Ludwig Edelstein, "The Hippocratic Oath: Text, Translation, and
Interpretation," in Ancient Medicine (Johns Hopkins Press, 1976), pp. 3-63.

7. The oath states the patient-benefitting principle twice, first
with regard to dietetic measures (one of the three elements of Pythagorean
medicine): "I will apply dietetic measures for the benefit of the sick
according to my ability and judgment." Later a more general form of the
patient-benefit principle is repeated, this time without the explicit
statement that the standard is to be the physician's own judgment, al- 
though this time the notion of intention is introduced: "Whatever houses 
I may visit, I will come for the benefit of the sick, remaining free of 
all intentional injustice..." See text in Edelstein, ibid., p. 6.






8. In addition to the fact that the patient-benefitting principle,
if taken seriously, excludes all experimentation not done in the inter- 
ests of the patient, it can also be criticized as being excessively in- 
dividualistic (concentrating only on benefit to the individual, isolated 
patient) and paternalistic (using the physician's own judgment as the 
standard of reference). That it focuses exclusively on benefits and 
harms to the exclusion of other ethical questions such as rights and
obligations inherent in action, is a problem we shall discuss below.

9. It has been recognized that in special circumstances so-called
non-therapeutic research might be undertaken on healthy subjects in the
name of patient-benefit. If an individual were at high risk of a par-
ticular disease, testing a vaccine on that person in the face of an epi-
demic might be justified on the grounds that the risk to the patient him-
self was less in conducting the trial of the vaccine than in letting the
patient go unprotected. (See Paul Ramsey, The Patient as Person (New
Haven: Yale University Press, 1970), pp. 15-16.)
of patient-benefit remains the norm. The judgment to include the patient 
in the test is made on strictly patient-benefitting grounds without con- 
sideration of benefit to others which might come from the knowledge gained. 

10 See Robert J. Levine's paper for the National Commission for the
Protection of Human Subjects for a more extensive discussion of the
distinction between therapeutic and non-therapeutic research.

11 Here I must explicitly reject the argument that some procedures
undertaken where the two objectives of benefitting the patient and gain- 
ing knowledge both are present should not be seen as experimental. Ber- 
nard M. Dickens, for instance, argues that "If no orthodox treatment exists 
for the patient's condition (either because of the condition's novelty or 
because the orthodox treatment has become discredited by advances in medi- 
cal knowledge) the physician's innovation will be nonexperimental." (Ber- 
nard M. Dickens, "What is a Medical Experiment?", Canadian Medical Associ- 
ation Journal 113 (Oct. 4, 1975), pp. 635-639, quotation from p. 636.) 
That seems to me to simply be a flagrant corruption of the term "experi- 
ment." It is one thing to say that under these circumstances there is 
no known better alternative; it is another to say that the trial of an 
unproved treatment is not experimental. Especially since he believes 
that a lower standard of consent may be required when a treatment is not 
experimental (a position which I reject in any case), much is at stake in 
the definitional debate. Certainly the patient ought to have the option 
of doing nothing in these circumstances, an option which by definition 
has not been shown to be any worse than the novel therapy. 

12 "Food and Drug Administration: Consent for Use of Investigational
New Drugs (IND) on Humans--Statement of Policy," text in Jay Katz,
Experimentation with Human Beings (New York: Russell Sage Foundation,
1972), p. 573. The same wording is reaffirmed in "Food and Drug
Administration: Drugs for Human Use: Reorganization and Republication,"
Federal Register (March 29, 1974), pp. 11684-11685 and 11712-11718.

13 Department of Health, Education and Welfare, The Institutional
Guide to DHEW Policy on Protection of Human Subjects (Washington: U.S.
Government Printing Office, 1971), p. 8.

14 "Protection of Human Subjects: Proposed Policy," Federal Register
38 (Part II, October 9, 1973).

15 "Protection of Human Subjects," Federal Register 39 (Part II,
May 30, 1974).

16 It has long been recognized that it may be reasonable to persist
in requiring that a rule such as the informed consent rule be followed
even in individual cases where it appears that more good would come if
the rule were violated. This is justified either on the grounds that the
human being is sufficiently fallible that the rule is more likely to 
produce good on balance than individual judgment is or on the grounds 
that it is the nature of rules that they specify practices, practices 
which in turn might be chosen because they will produce more good than 
any other social practice. See John Rawls, "Two Concepts of Rules," 
Philosophical Review 64 (1955), pp. 3-32. 

17 See Ralph J. Alfidi, "Informed Consent: A Study of Patient Re- 
action," Journal of the American Medical Association 216 (May 24, 1971), 
pp. 1325-29, for empirical evidence. 

18 I recognize that the argument about consent in cases where the
consent would do more harm than good implies that there may in fact be
such cases. I am not prepared to concede that there are. If one
recognizes that lack of consent per se may do harm--the patient may
have unallayed fears, confusion about what behaviors are appropriate,
etc.--then a case might be made that consent is always necessary on
patient-benefitting grounds. For this discussion I presume,
hypothetically, that consent might be contraindicated in some
therapeutic experiment on patient-benefitting grounds.

Charles Fried in his important new discussion of the ethical 
foundations of experimentation develops the theme of "personal care" 
as the duty of the physician. (Charles Fried, Medical Experimentation:
Personal Integrity and Social Policy (New York: American Elsevier
Publishing Co., Inc., 1974).) At one point he uses a qualified argument
of "therapeutic privilege," that is, the argument that information
could be withheld on grounds of patient benefit (p. 22). Later, however, when
he develops the theme of "personal care" he makes the claim that personal 
care involves a notion of rights which belong to the patient which seem 
to be independent of consequences. These rights include "a right to
know all relevant details" (p. 101), autonomy, trust, and "the right to
be treated without deceit or violence" (p. 103). If, however, Fried per-
ceives these to be rights inherent in personal care, it is hard to see 
how the physician has the "privilege" of overriding them when he believes 
(rightly or wrongly) that the overriding would be therapeutic. Thera- 
peutic "privilege," if it exists at all, must be precisely that, a pri- 
vilege the physician acquires because the patient has ceded the rights 
Fried has outlined. 

19 American Medical Association Judicial Council, Opinions and
Reports of the Judicial Council (Chicago: A.M.A., 1971), pp. 11-12.
Also see, in addition to section 2, which commits the physician to
improve medical knowledge, sections 1, 4, 9, and 10, where the
physician is explicitly committed to serving society or other
collective groups as well as the individual patient.

20 See Jeremy Bentham, An Introduction to the Principles of Morals
and Legislation; John Stuart Mill, Utilitarianism; G.E. Moore, Principia
Ethica (London: Cambridge University Press, 1903); and Henry Sidgwick,
The Methods of Ethics (London: Macmillan and Co., Ltd., 1907).

21 William Harvey, Exercitatio Anatomica de Motu Cordis et Sanguinis
in Animalibus, 1628. Also see Henry E. Sigerist, "William Harvey's
Position in the History of European Thought," in On the History of
Medicine (New York: MD Publications, 1960), pp. 184-192.

22 Chauncey D. Leake, ed., Percival's Medical Ethics (Huntington,
New York: Robert E. Krieger Publishing Co., 1975), p. 76.

23 Claude Bernard, An Introduction to the Study of Experimental
Medicine (New York: Dover, 1957), p. 102. Bernard certainly has gone
further than even the classical utilitarians in claiming that
experiments which may do good are obligatory. They would not be,
according to the utilitarians, unless all things considered they would
be likely to do more good than any other course of action. Bernard,
contrary to some interpretations of the negative formulation of the
physician's duty primum non nocere (first, do no harm), seems to treat
harms and benefits on the same scale.

24 This function corresponds to the point made by Katz and Capron
that one purpose of informed consent is "to involve the public." Jay
Katz and Alexander Morgan Capron, Catastrophic Diseases: Who Decides
What? (New York: Russell Sage Foundation, 1975), p. 90; cf. Levine,
"The Nature and Definition," p. 3.

25 "Protection of Human Subjects," Federal Register 39 (Part II, May
30, 1974), p. 18919.

26 See Carl J. Wiggers, "Human Experimentation as Exemplified by
the Career of Dr. William Beaumont," in Clinical Investigation in
Medicine: Legal, Ethical and Moral Aspects, edited by Irving Ladimer
and Roger W. Newman (Boston: Law-Medicine Research Institute, Boston
University, 1963), pp. 119-125.

27 St. Martin bound himself to "Serve, abide and continue with the
said William Beaumont, wherever he shall go or travel or reside in any 
part of the world his covenant servant and diligently and faithfully... 
submit to assist and promote by all means in his power such philosophical 
or medical experiments as the said William shall direct or cause to be 
made on or in the stomach of him, the said Alexis, either through and by 

means of the aperture or opening thereto in the side of him, the said 
Alexis, or otherwise, and will obey, suffer and comply with all rea- 
sonable and proper orders of or experiments of the said William in re- 
lation thereto and in relation to the exhibiting and showing of his 
said stomach and the powers and properties thereto and of the appur- 
tenances and the powers, properties and situation and state of the 
contents thereof." Text from William Beaumont, Experiments and
Observations on the Gastric Juice and the Physiology of Digestion,
1833, cited in Henry Beecher, Research and the Individual: Human
Studies (Boston: Little, Brown, 1970), p. 219.

28 See Michael R. LaChat, "Utilitarian Reasoning in Nazi Medical
Policy: Some Preliminary Investigations," Linacre Quarterly 42 (Feb.
1975), pp. 14-37; for an important discussion of the general problems
of utilitarian justification of human experimentation see Ruth Macklin
and Susan Sherwin, "Experimenting with Human Subjects: Philosophical
Perspectives," Case Western Reserve Law Review 25 (1975), pp. 434-471.

29 "The experiment is to be such as to yield fruitful results for
the good of society, unprocurable by other methods or means of study, 
and not random and unnecessary in nature." 

30 "1. The voluntary consent of the human subject is absolutely 
essential. 

This means that the person involved should have legal capacity 
to give consent; should be so situated as to be able to exercise free 
power of choice, without the intervention of any element of force, fraud, 
deceit, duress, over-reaching, or other ulterior form of constraint or
coercion; and should have sufficient knowledge and comprehension of the 
elements of the subject matter involved as to enable him to make an 
understanding and enlightened decision. This latter element requires 
that before the acceptance of an affirmative decision by the experi- 
mental subject there should be made known to him the nature, duration, 
and purpose of the experiment; the method and means by which it is to 
be conducted; all inconveniences and hazards reasonably to be expected; 
and the effects upon his health or person which may possibly come from 
his participation in the experiment. 

The duty and responsibility for ascertaining the quality of 
the consent rests upon each individual who initiates, directs, or en- 
gages in the experiment. It is a personal duty and responsibility which 
may not be delegated to another with impunity." In Jay Katz,
Experimentation with Human Beings, op. cit., p. 305.

31 See the interesting discussion in Bernard M. Dickens, "Contractual
Aspects of Human Medical Experimentation," University of Toronto Law
Journal 25 (1975), pp. 406-438.

32 Carpenter v. Blake 60 Barb. 488 (N.Y. Sup. Ct. 1871).

33 Jackson v. Burnham 20 Colo. 532 Pac. 577 (1895).

34 Schloendorff v. New York Hospital 211 N.Y. 125, 129, 105 N.E.
92, 93 (1914), text in Jay Katz, op. cit., p. 526.

35 Natanson v. Kline 186 Kan. 393 P.2d 1093 (1960), text cited in
Jay Katz, op. cit., p. 533.

36 Fortner v. Koch 272 Mich. 272 N.W. 762 (1935).

37 "Protection of Human Subjects," Federal Register 39 (Part II, May
30, 1974), p. 18917.

38 I concede that someone with imagination might argue that there
are indirect but serious risks--that mankind's respect for the
genetically abnormal would change and that, in turn, would have a
psychological impact on the subject. If those risks are included,
however, it seems that any research would have risks and the proviso
"if risk is involved" is meaningless.

It seems more plausible to say that the subject is not really 
at risk in the normal sense of the term, but that the rights and welfare 
of others are and that that is sufficient reason why some might want to 
refuse to consent to participate in the study. 

39 I endorse Robert J. Levine's emphasis on explaining the "larger
ultimate purpose" as well as the immediate one. See Levine, "The Nature 
and Definition," p. 10. 

40 See Immanuel Jakobovits, Jewish Medical Ethics (New York: Bloch
Publishing Co., 1959), pp. 132-152; Fred Rosner, Modern Medicine and
Jewish Law (New York: Yeshiva University Press, 1972), pp. 132-154; and
David Bleich, "Medical Experimentation Upon Severed Organs," in his
"Survey of Recent Halakhic Periodical Literature," Tradition 12
(Summer 1971), pp. 89-90.

41 See L.C. Epstein and L. Lasagna, "Obtaining Informed Consent:
Form or Substance," Archives of Internal Medicine 123 (1969), pp. 682-688.

42 There are a number of variants on the professional standard:
what is customary for physicians in the community to disclose, what phy- 
sicians more generally in society or in a specialty group would disclose, 
or what the "reasonable physician" would disclose. All rely on a profes- 
sional standard. See Leonard L. Riskin, "Informed Consent: Looking for 
the Action," University of Illinois Law Forum 1975 (number 4, 1975), pp. 
580-611, especially pp. 585-586. 

43 Natanson v. Kline 186 Kan. 393 P.2d 1093 (1960), cited in Jay
Katz, op. cit., p. 534.

44 California, Idaho, New York, Ohio, Oregon, Pennsylvania, Rhode
Island, Washington, Wisconsin, and Tennessee; but cf. Karp v. Cooley,
349 F. Supp. 827 (S.D. Tex. 1972), affirmed 493 F.2d 408 (5th Cir. 1974).

45 Riskin, op. cit. Also see Don Harper Mills, "Whither Informed
Consent?" Journal of the American Medical Association 229 (July 15,
1974), pp. 305-309, where Mills concludes (p. 305) that "the 'standard
of practice' basis for judging the extent of disclosure will probably
give way to a new rule of reasonableness; though what the courts believe
to be reasonable disclosure may not necessarily be consistent with what
physicians believe should be disclosed."

46 Garnham, op. cit., pp. 143-44. Many, including Garnham, still
maintain that although informed consent of patient or subject is
impossible, some consent should still be obtained. There seems to be
an inconsistency in this position.

46a Of course, the same point can be made on patient- or subject-
benefitting grounds if one holds that determining what is beneficial
to the patient/subject is dependent upon the subject's own values. The
fact that the researcher or the research community would find some piece
of information irrelevant, given the researcher's values or the values
of the research community as a whole, cannot be taken to imply that it
would be irrelevant in another value context.

47 Berkey v. Anderson, 1 Cal. App. 3d 790, 805, 82 Cal. Rptr. 67,
78 (1969).

48 The implications of the reasonable person court decisions for
the composition of human experimentation committees and the questions
they must answer are explored in greater detail in Robert M. Veatch,
"Human Experimentation Committees: Professional or Representative?"
Hastings Center Report 5 (October 1975), pp. 31-40.

49 Norman Fost, "A Surrogate System for Informed Consent," Journal
of the American Medical Association 233 (Aug. 18, 1975), pp. 800-803.

50 This interpretation differs slightly from that of Robert J. Levine,
"The Nature and Definition," p. 19. He says that the reasonable person
standard "puts the particular physician or investigator in the precarious
position of having to know in advance what harms a particular patient or 
subject might consider material after they occur." I agree that the phy- 
sician is placed in a precarious position, but I do not think it is quite 
that precarious. My reading of the case law is that the physician must 
simply disclose what the reasonable person would find meaningful or use- 
ful. This should be modified when the physician has reason to believe 
that the individual patient or subject differs from that reasonable per- 
son view, but, unless the physician has negligently or maliciously avoided 
the discovery that the individual patient differs from the reasonable 
person, I do not see that he would be held to the standard of that (deviant) 
patient or subject. Of course, the physician is still in a precarious 
position because this series of cases makes clear that the physician's 
own judgment or even the consensus of medical professionals cannot be 
taken to adequately predict what the reasonable person would want to 
know. This does mean, however, that an all lay committee made up of 
individuals reasonably presumed to be reasonable would be a plausible 
test of the adequacy of the information, unless there was information 
to the contrary about the individual subject. Levine goes on (p. 20) 
to introduce the reasonable person standard, but without qualifying 
it for the case when the researcher knows or should know that the 
subject differs from that reasonable person. 

51 "Protection of Human Subjects," Federal Register 39 (Part II,
May 30, 1974), p. 18917.

52 For a fuller discussion of these elements see the author's
"Ethical Principles of Medical Experimentation," in Ethical and Legal
Issues of Social Experimentation, edited by Alice M. Rivlin and P.
Michael Timpane (Washington: The Brookings Institution, 1975), pp.
21-59, especially pp. 52-57.

53 See a fuller discussion of this issue below.

54 The specific mention of "inconveniences" occurs in the Nuremberg
Code. Its omission has been taken as justifying non-disclosure of
inconveniences by some review committee members, although
"inconveniences" could be taken to be subsumed in "risks." That
"discomforts" is listed as separate from "risks" can be cited to
support the claim that "inconveniences" are not to be taken as risks.

55 "Protection of Human Subjects," Federal Register 39 (Part II,
May 30, 1974), p. 18918, paragraph 46.9.

56 Robert J. Levine, "The Nature and Definition," pp. 10, 11, 25.

57 Levine, ibid., p. 57. Cf. "The Protection of Human Subjects,"
Federal Register 39 (Part II, May 30, 1974), p. 18919.

58 See Louis Lasagna, The Conflict of Interest Between Physician
as Therapist and as Experimenter (Philadelphia: Society for Health and
Human Values, 1975).

59 John Stuart Mill, On Liberty (New York: Liberal Arts Press, 1956),
p. 114, where he argues "for such actions as are prejudicial to the
interests of others, the individual is accountable and may be subjected
either to social or to legal punishment if society is of opinion that
the one or the other is requisite for its protection."



60 Ibid., p. 125.



61 This is the term preferred by Alexander Morgan Capron, "Legal
Considerations Affecting Clinical Pharmacological Studies in Children,"
Clinical Research 21 (1972), pp. 141-150.

62 Paul Ramsey, op. cit., chapter 1.

63 The argument here is related to Richard McCormick's in "Proxy
Consent in the Experimentation Situation," Perspectives in Biology and 
Medicine 18 (Autumn 1974), pp. 2-20. 

64 G. Emmett Raitt, "The Minor's Right to Consent to Medical Treat-
ment: A Corollary of the Constitutional Right of Privacy," Southern
California Law Review 48 (1975), pp. 1417-1456.

65 In re the matter of Karen Quinlan, an alleged incompetent: Superior
Court of New Jersey, Chancery Division, Morris County, Docket No. C-201-75;
Winters v. Miller, 446 F.2d 65 (C.A. 2, May 26, 1971).

66 New York City Health and Hospitals Corporation and Edward A.
Stolzenberg, Associate Director, Bellevue Hospital, Petitioners, v.
Paula Stein, a patient, respondent, 335 N.Y.S. 2d 461.

67 In re appointment of a guardian of the person of Maida Yetter,
Docket No. 1973-533 (Pa. Ct. of Common Pleas, Northampton Co. Orphan's
Ct., June 6, 1973).

68 N.Y. State Mental Hygiene Law, Article 15, "Rights of Patients," 
Section 15.03, Point (b)4. 

69 I have heard this suggested in personal communication with Karen
Lebacqz and, independently, by Roy Branson. Whether the system would
work I do not know. Possibly prison officials or dominant prisoners
would gain control of the funds in some cases, leading to their
inequitable use.
Also it is not clear why drug companies and others wanting to do research 
would choose prisoners as subjects under these conditions. In order to 
get subjects in a controlled environment for long periods of time they 
might recruit students who would agree to spend weeks in a controlled 
institution in exchange for offerings of summer school courses, room and 
board at the researchers' expense. Whether an offer to an economically 
deprived group such as students would be seen as less coercive than for 
prisoners, I do not know, but at least institutional review and control 
might be more dependable and the danger of use of research for retribution 
would be eliminated. 

70 American Medical Association House of Delegates, "Resolution on
Disapproval of Participation in Scientific Experiments by Inmates of
Penal Institutions," text in Henry Beecher, op. cit., p. 225. This
position may also be rooted in the concern for protecting the public
interest insofar as research participation leads to earlier release
from prison. It would not directly justify opposition to meritorious
or commendatory citation, however.

71 We have already argued that the positions of private groups of
professionals should not be persuasive in setting public policy. They
are often committed to special value stances not shared by the general
public. Thus the acceptance by the American Psychological Association
of deception and even lying--presumably on the justification of the
social benefits to be obtained--should not be terribly relevant to the
Commission. The Association's unique commitment to the social value 
of the particular type of knowledge gained should not influence com- 
missioners who have a public obligation to protect constitutionally 
guaranteed rights. See American Psychological Association, Ethical
Principles in the Conduct of Research With Human Participants
(Washington: A.P.A., 1973), pp. 29-35; cf. Code of Ethics, American
Sociological Association, which says obliquely, "Just as sociologists
must not distort or manipulate truth to serve untruthful ends, so too
they must not manipulate persons to serve their quest for truth."

72 Hans Jonas, Philosophical Essays (Englewood Cliffs, New Jersey:
Prentice-Hall, 1974), p. 126.

73 See the qualification of this statement related to a possible
theory of justice below.

74 I believe this is a more explicit principle and more precise
requirement than Levine advocates. See Levine, "The Nature and
Definition," p. 30.

75 "Protection of Human Subjects: Proposed Policy," Federal Register,
August 23, 1974, pp. 30653-30654.

76 "Protection of Human Subjects: Fetuses, Pregnant Women, and In
Vitro Fertilization," Federal Register, August 8, 1975, pp. 33526-33552.
I also support third party scrutiny proposals as set forth by Levine,
op. cit., pp. 46-52. However, I feel such third parties should be used
only with the consent of the subject. Discussing the proposed research
first with the next of kin and/or a physician not connected with the
research is certainly a violation of the individual's right to
confidentiality. It should be clear, however, that third parties must
be independent of the researcher and his or her staff. Thus the debate
between Don Harper Mills and Alan Meisel over whether the physician or
the nurses and physician assistants associated with him would be better
witnesses of the consent may be misplaced. While certainly the
physician or researcher cannot be an adequate witness of a consent
contract between himself and the patient or subject, those working
under his supervision would not be adequate either. See Alan Meisel,
"Informed Consent--The Rebuttal," Journal of the American Medical
Association 234 (Nov. 10, 1975), p. 615; and Don Harper Mills,
"Informed Consent--The Rejoinder," Journal of the American Medical
Association 234 (Nov. 10, 1975), p. 616.

77 "Position Statement of the American Federation for Clinical
Research on the DHEW Proposed Rules on Protection of Human Subjects,"
Clinical Research 23 (1975), pp. 53-60.

78 John Rawls, A Theory of Justice (Cambridge, Mass.: Harvard
University Press, 1971).

79 See Brian Barry, The Liberal Theory of Justice (New York:
Oxford University Press, 1973); and Robert M. Veatch, "What Is a
Just Health Care Delivery?" in Ethics and Health Policy, edited
by Robert M. Veatch and Roy Branson (Cambridge, Mass.: Ballinger
Press, forthcoming).

80 See Macklin and Sherwin, op. cit.

U.S. Department of Health, Education, and Welfare 
DHEW Publication No. (OS) 78-0014